
Update dependencies #14

Open
MostlyKIGuess wants to merge 2 commits into main

Conversation

MostlyKIGuess
Member

  • security fixes

@chimosky
Member

Did you test your changes to be sure nothing breaks with these versions?

@MostlyKIGuess
Member Author

Did you test your changes to be sure nothing breaks with these versions?

While checking, I realized most of them are not needed; I only kept the necessary ones and will push a commit for that. I saw the mail and updated the dependencies; I want to fix this before this Tuesday.

- add command-line interface
- include unit tests for functionality
@MostlyKIGuess
Member Author

Welp, the master branch wasn't working, so I made a lot of changes, almost a revamp. But shouldn't we just merge the docs of this along with https://sugar-docs-ai.streamlit.app/? I am willing to update this as well; I just saw the mail regarding the dependency issues and thought I would fix it.

@MostlyKIGuess
Member Author

The issue with the current master is that ollama and httpx can't be run simultaneously; they break each other's dependencies.

@chimosky
Member

The issue with the current master is that ollama and httpx can't be run simultaneously; they break each other's dependencies.

Are you referring to aiohttp and ollama or langchain-ollama?

Also @kshitijdshah99 why do we have two ollama dependencies?

@MostlyKIGuess
Member Author

The issue with the current master is that ollama and httpx can't be run simultaneously; they break each other's dependencies.

Are you referring to aiohttp and ollama or langchain-ollama?

Also @kshitijdshah99 why do we have two ollama dependencies?

ollama and httpx, check ollama/ollama-python#356

@kshitijdshah99
Contributor

The issue with the current master is that ollama and httpx can't be run simultaneously; they break each other's dependencies.

Are you referring to aiohttp and ollama or langchain-ollama?

Also @kshitijdshah99 why do we have two ollama dependencies?

Both these dependencies serve different purposes. langchain-ollama is a package that provides the interaction between LangChain and Ollama for building LangChain workflows. On the other hand, ollama is for pulling and managing the different models, like llama3.1, Mistral, or CodeLlama, that we downloaded locally on our system.
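
Roughly, the split looks like this (a minimal sketch; the model name is only an example, not what the project pins):

```python
# Sketch of how the two packages are used for different things.

# langchain-ollama: exposes Ollama models to LangChain workflows.
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.1")   # example model name
reply = llm.invoke("Hello!")         # LangChain-style chat interface
print(reply.content)

# ollama: client for pulling and managing locally downloaded models.
import ollama

ollama.pull("llama3.1")              # download/update the model locally
print(ollama.list())                 # show models available on this machine
```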

@MostlyKIGuess
Member Author

The issue with the current master is that ollama and httpx can't be run simultaneously; they break each other's dependencies.

Are you referring to aiohttp and ollama or langchain-ollama?

Also @kshitijdshah99 why do we have two ollama dependencies?

Both these dependencies serve different purposes. langchain-ollama is a package that provides the interaction between LangChain and Ollama for building LangChain workflows. On the other hand, ollama is for pulling and managing the different models, like llama3.1, Mistral, or CodeLlama, that we downloaded locally on our system.

We should just pull from HF instead; that way we can remove ollama. What say?
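
Something along these lines, assuming huggingface_hub replaces the ollama client (the repo id below is a placeholder, not the model the app actually uses):

```python
# Hypothetical sketch: fetch model weights from the Hugging Face Hub
# instead of managing them through ollama.
from huggingface_hub import snapshot_download

# Placeholder repo id; the real one depends on which model the app needs.
local_dir = snapshot_download(repo_id="some-org/some-model")
print("Model files downloaded to:", local_dir)
```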

@kshitijdshah99
Contributor

I'd suggest, if @chimosky agrees, that we try solving this issue first instead of switching directly to HuggingFace, because that would mean adjusting many code files.

@chimosky
Member

I'd suggest, if @chimosky agrees, that we try solving this issue first instead of switching directly to HuggingFace, because that would mean adjusting many code files.

If we can do without a dependency, then we should.
