The whole project is currently based on a container structure using Docker, which is great for portability. However, it would be great to have support for Windows and Linux systems without needing Docker, for better access to code debugging, hardware, and testing. It would also be great to support other LLMs, as mentioned in (#64 (comment)), and especially LM Studio, since it provides access to a wide range of open-source models via Hugging Face at no cost.
I know that I could just install Docker, but having full native Windows support would be better.
Potential Solutions
I've noticed the extensive use of Linux commands, and I'm not sure they will all be available on Windows. I think we can work around this either by checking the platform with a Python library, or by replacing the commands with Python code (which handles platform differences automatically).
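A minimal sketch of both ideas (the helper names are hypothetical, not from the project): detecting the host OS with the standard-library `platform` module, and replacing a shell command such as `ls` with pure Python so no per-platform branch is needed at all.

```python
import platform
from pathlib import Path

def detect_platform():
    # platform.system() returns "Windows", "Linux", or "Darwin" (macOS),
    # so code can branch on the host OS where a shell command differs.
    return platform.system()

def list_files(path="."):
    # Pure-Python replacement for the shell command `ls`:
    # pathlib works identically on Windows and Linux, no subprocess needed.
    return sorted(p.name for p in Path(path).iterdir())
```

The second approach is usually preferable where a standard-library equivalent exists, since it avoids maintaining two command sets entirely.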
While you might not need the Docker container to run `run.py` (Windows support will probably be ready soon after #17 lands), the commands will always be executed in a Docker container running Linux.
To support more models, I think #64 (litellm) makes more sense.
Generally, it's unlikely that local models are good enough to solve these issues at the current state of the art.
I totally agree, but as model architectures progress we might be able to run 70B models locally, and we might start seeing models fine-tuned for SWE-agent or similar platforms like Devika, which could perform well.