"In the current PR #45, which fixes issue #44 and it is also currently checked out as the current branch, there isn't duplication of the checks are there? In your writeup you say that "added pre-push hook with the same validation". It seems that we have both a pre-commit hook and a pre-push hook that do the same thing? Won't that slow things down?"
"CAn you implement github issue #31 and also #30 and do a combined PR back to Github?"
"CAn you execute issue 32 and make a PR for it back to github?"
"CAn you create a Github issue for removing the base tsconfig.json file and instead just using fully defined tsconfig.json files in each package in both the packages folder and the services folder? Complex tsconfig.json strategies with shared settings can introduce a lot of unnecessary complexity. Can you also make a Github issue for combining all packages into a single packages folder rather than having them split between packages and services? There isn't enough packages to warrant the split here. Third can you create an issue for updating the root README.md so that it describes the project, what each package does and the main ways developers (and agentic agents) should interact with it?"
"an you implement the recommendations 2 and 3 from issue #44. You can look at the CI Github Actions workflow in ../mycoder-websites/.github as guide to setting up a similar CI action that validates the build and runs lint, etc for this repo."
"Can you make the blue that is used for the links to be a little more dark-grey blue? And can you remove the underline from links by default? Please create a Github issue for this and a PR."
"Can you confirm the project builds? I am not sure it does."
"I think that the github action workflows and maybe the docker build are still making assumptions about using npm rather than pnpm. Can you look at ../Business/drivecore/mycoder-websites as an example of docker files that use pnpm and also github action workflows that use pnpm and adapt the current project to use that style. Pleaes create a github issue and then once the task is complete please submit a PR."
"IT seems that the latest GitHub action failed, can you investigate it and make a GitHub issue with the problem and the push PR that fixes the issue? Please wait for the new GitHub action to complete before declaring success."
"You just created PR #34 which fixes issues #30 and #31. But the CI is failing, you can check the Github Actions to see why. Can you address the issue? I do worry that your recent reorganization of the repo may not align with some of the assumptions in the Github action workflows or maybe the docker files or something. Anyhow, please have a look and if you can fix it, please do. If the issue is unrelated to the two issues you fixed in the recent PR, then make a Github issue to capture the problem and make a separate PR to fix it."
"When I run this command "pnpm --filter @web3dsurvey/api-server build" in the current directory, it runs into an error because one of the packages in this mono-repo upon which @web3dsurvey/api-server is dependent is not built, but I am confused because I thought that pnpm would automatically build packages that are depended upon. I must have some part of the configuration of the current project incorrect right? Can you create an issue for this and the investigate. You can use the command "pnpm clean:dist" to reset the package to its non-built state."
"REcently this project was converted from using the Anthropic SDK directly to using the Vercel AI SDK. . Since then it has created reliability problems. That change was made 4 days ago in this PR: https://github.com/drivecore/mycoder/pull/55/files And it was built upon by adding support for ollama, grok/xai and openai in subsequent PRs. I wouldl like to back out the adoption of the Vercel AI SDK, both the 'ai' npm library as well as the '@AI-SDK' npm libraries and thus also back out support for Ollama, OpenAI and Grok. In the future I will add back these but the Vercel AI SDK is not working well. While we back this out I would like to, as we re-implement using the Anthropic SDK, I would like to keep some level of abstraction around the specific LLM. . Thus I would like to have our own Message type and it should have system, user, assistant, tool_use, tool_result sub-types with their respective fields. We can base it on the Vercel AI SDK. And then we should implement a generic generateText() type that takes messages and the tools and other standard LLM settings and returns a new set of messages - just as anthropic's SDK does. We can have an Anthropic-specific function that takes the API key + the model and returns a generateText() function that meets the generic type. Thus we can isolate the Anthropic specific code from the rest of the application making it easier to support other models in the future. The anthropic specific implementation of generateText will have to convert from the generic messages to anthropics specific type of messages and after text completion, it will need to convert back. This shouldn't be too involved. We can skip token caching on the first go around, but lets create both an issue for this main conversion I've described as well as follow on issues to add token caching as well as OpenAI and Ollama support. You can check out old branches of the code here if that helps you analyze the code to understand. I would like a plan of implementation as a comment on the first issue - the main conversion away from Vercel AI SDK."
"In the current PR #45, which fixes issue #44 and it is also currently checked out as the current branch, there isn't duplication of the checks are there? In your writeup you say that "added pre-push hook with the same validation". It seems that we have both a pre-commit hook and a pre-push hook that do the same thing? Won't that slow things down?"
"CAn you implement github issue #31 and also #30 and do a combined PR back to Github?"
"CAn you execute issue 32 and make a PR for it back to github?"
"CAn you create a Github issue for removing the base tsconfig.json file and instead just using fully defined tsconfig.json files in each package in both the packages folder and the services folder? Complex tsconfig.json strategies with shared settings can introduce a lot of unnecessary complexity. Can you also make a Github issue for combining all packages into a single packages folder rather than having them split between packages and services? There isn't enough packages to warrant the split here. Third can you create an issue for updating the root README.md so that it describes the project, what each package does and the main ways developers (and agentic agents) should interact with it?"
"an you implement the recommendations 2 and 3 from issue #44. You can look at the CI Github Actions workflow in ../mycoder-websites/.github as guide to setting up a similar CI action that validates the build and runs lint, etc for this repo."
"Can you make the blue that is used for the links to be a little more dark-grey blue? And can you remove the underline from links by default? Please create a Github issue for this and a PR."
"Can you confirm the project builds? I am not sure it does."
"I think that the github action workflows and maybe the docker build are still making assumptions about using npm rather than pnpm. Can you look at ../Business/drivecore/mycoder-websites as an example of docker files that use pnpm and also github action workflows that use pnpm and adapt the current project to use that style. Pleaes create a github issue and then once the task is complete please submit a PR."
"IT seems that the latest GitHub action failed, can you investigate it and make a GitHub issue with the problem and the push PR that fixes the issue? Please wait for the new GitHub action to complete before declaring success."
"You just created PR #34 which fixes issues #30 and #31. But the CI is failing, you can check the Github Actions to see why. Can you address the issue? I do worry that your recent reorganization of the repo may not align with some of the assumptions in the Github action workflows or maybe the docker files or something. Anyhow, please have a look and if you can fix it, please do. If the issue is unrelated to the two issues you fixed in the recent PR, then make a Github issue to capture the problem and make a separate PR to fix it."
"When I run this command "pnpm --filter @web3dsurvey/api-server build" in the current directory, it runs into an error because one of the packages in this mono-repo upon which @web3dsurvey/api-server is dependent is not built, but I am confused because I thought that pnpm would automatically build packages that are depended upon. I must have some part of the configuration of the current project incorrect right? Can you create an issue for this and the investigate. You can use the command "pnpm clean:dist" to reset the package to its non-built state."
"REcently this project was converted from using the Anthropic SDK directly to using the Vercel AI SDK. . Since then it has created reliability problems. That change was made 4 days ago in this PR: https://github.com/drivecore/mycoder/pull/55/files And it was built upon by adding support for ollama, grok/xai and openai in subsequent PRs. I wouldl like to back out the adoption of the Vercel AI SDK, both the 'ai' npm library as well as the '@AI-SDK' npm libraries and thus also back out support for Ollama, OpenAI and Grok. In the future I will add back these but the Vercel AI SDK is not working well. While we back this out I would like to, as we re-implement using the Anthropic SDK, I would like to keep some level of abstraction around the specific LLM. . Thus I would like to have our own Message type and it should have system, user, assistant, tool_use, tool_result sub-types with their respective fields. We can base it on the Vercel AI SDK. And then we should implement a generic generateText() type that takes messages and the tools and other standard LLM settings and returns a new set of messages - just as anthropic's SDK does. We can have an Anthropic-specific function that takes the API key + the model and returns a generateText() function that meets the generic type. Thus we can isolate the Anthropic specific code from the rest of the application making it easier to support other models in the future. The anthropic specific implementation of generateText will have to convert from the generic messages to anthropics specific type of messages and after text completion, it will need to convert back. This shouldn't be too involved. We can skip token caching on the first go around, but lets create both an issue for this main conversion I've described as well as follow on issues to add token caching as well as OpenAI and Ollama support. You can check out old branches of the code here if that helps you analyze the code to understand. I would like a plan of implementation as a comment on the first issue - the main conversion away from Vercel AI SDK."
The text was updated successfully, but these errors were encountered: