OS-Copilot (open interpreter)
OS-Copilot
https://github.com/OS-Copilot/OS-Copilot
A self-improving embodied conversational agent, seamlessly integrated into the operating system to automate daily tasks.
OS-Copilot is an open-source library to build generalist agents capable of automatically interfacing with comprehensive elements in an operating system (OS), including the web, code terminals, files, multimedia, and various third-party applications.
Alibaba Cloud
https://help.aliyun.com/zh/alinux/user-guide/instructions-for-os-copilot
https://developer.aliyun.com/article/1569065#:~:text=OS%20Copil
https://openanolis.cn/blog/detail/1129965695296482635#:~:text=%E4%B8%BA%E4%BA%86%E5%BA%94%E5%AF%B9%E8%BF%99%E4%BA%9B%E6%8C%91%E6%88%98
Features
Natural-language Q&A on the command line
With natural-language Q&A on the command line, OS Copilot lets users type questions in plain language directly in the OS shell and get answers to everyday and OS-related questions. This removes the need to switch to a browser to search, lowers the cost of using the OS, keeps the workflow uninterrupted, and improves day-to-day efficiency and user experience. Through its backend LLM, OS Copilot gives users easy access to the operating-system knowledge the Alibaba Cloud OS team has accumulated over many years, so they can find OS-related information more precisely, especially for Alibaba Cloud Linux (Alibaba Cloud's in-house distribution) and Anolis OS (the distribution of the OpenAnolis open-source community, led by Alibaba Cloud).
Assisted command execution
Unlike Windows with its graphical desktop, Linux is operated mainly through the command line for day-to-day use and maintenance. Its large and complex set of commands is the first hurdle for Linux beginners. OS Copilot provides assisted command execution, so even Linux novices can quickly get started with basic usage.
Alibaba Cloud CLI invocation
The Alibaba Cloud CLI (Command Line Interface) is a management tool built on Alibaba Cloud's OpenAPI. With it, you can manage Alibaba Cloud products by calling the OpenAPI. OS Copilot lets users invoke the Alibaba Cloud CLI from within the operating system, for example to query ECS information and instance IDs directly from the command line.
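As a rough illustration of the kind of call OS Copilot issues on the user's behalf, an ECS instance query maps onto an `aliyun` CLI invocation such as `aliyun ecs DescribeInstances --RegionId cn-hangzhou`. The helper below is only a sketch of how such an argv can be assembled; the helper name, the region ID, and the idea of driving the CLI from Python are assumptions for illustration, not OS Copilot's actual internals:

```python
def build_aliyun_cmd(service: str, action: str, **params) -> list[str]:
    """Build an argv list for the Alibaba Cloud CLI,
    e.g. `aliyun ecs DescribeInstances --RegionId cn-hangzhou`."""
    cmd = ["aliyun", service, action]
    for key, value in params.items():
        cmd += [f"--{key}", str(value)]
    return cmd

cmd = build_aliyun_cmd("ecs", "DescribeInstances", RegionId="cn-hangzhou")
# Actually running it requires configured credentials (`aliyun configure`),
# e.g. subprocess.run(cmd, check=True)
print(cmd)
```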
System operations and tuning
System operations and performance tuning are key scenarios in OS usage, and OS behavior has a large impact on the business running on it. Although many operations and tuning products exist on the market, effective tuning requires Linux-kernel expertise and long experience with troubleshooting, and people with those skills are scarce. OS Copilot can invoke operations and tuning tools via natural language, in particular Alibaba Cloud's in-house system tools, helping users easily locate system problems and improve system performance.
open-interpreter
https://github.com/OpenInterpreter/open-interpreter
Open Interpreter lets LLMs run code (Python, Javascript, Shell, and more) locally. You can chat with Open Interpreter through a ChatGPT-like interface in your terminal by running
$ interpreter
after installing. This provides a natural-language interface to your computer's general-purpose capabilities:
- Create and edit photos, videos, PDFs, etc.
- Control a Chrome browser to perform research
- Plot, clean, and analyze large datasets
- ...etc.
⚠️ Note: You'll be asked to approve code before it's run.
Hands-on
https://docs.openinterpreter.com/getting-started/introduction
https://docs.openinterpreter.com/language-models/local-models/lm-studio
interactive_quickstart.py
# This is all you need to get started
from interpreter import interpreter

interpreter.offline = True
interpreter.llm.model = "openai/local"  # Tells OI to use an OpenAI-compatible server
interpreter.llm.api_key = "dummy_key"
interpreter.llm.api_base = "http://172.20.160.1:1234/v1"
# interpreter.llm.model = "lmstudio-community/Meta-Llama-3.1-8B-Instruct-GGUF/Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf"
interpreter.llm.context_window = 7000
interpreter.llm.max_tokens = 1000
interpreter.llm.supports_functions = False
interpreter.chat()
https://docs.openinterpreter.com/settings/all-settings#local-mode
from interpreter import interpreter

interpreter.offline = True  # Disables online features like Open Procedures
interpreter.llm.model = "openai/x"  # Tells OI to send messages in OpenAI's format
interpreter.llm.api_key = "fake_key"  # LiteLLM, which we use to talk to local models, requires this
interpreter.llm.api_base = "http://localhost:1234/v1"  # Point this at any OpenAI compatible server
interpreter.chat()
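These settings simply point Open Interpreter's LiteLLM backend at any server that speaks the OpenAI chat-completions wire format (LM Studio, or the Zhipu endpoint below). As a minimal sketch of that wire format, the request such a server receives looks like the following; the helper name, the `urllib` usage, and the prompt are illustrative, while the payload fields follow the standard OpenAI chat-completions schema:

```python
import json
import urllib.request

def chat_completion_request(api_base: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style POST /chat/completions request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 1000,
    }
    return urllib.request.Request(
        url=f"{api_base}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Authorization": "Bearer fake_key"},
        method="POST",
    )

req = chat_completion_request("http://localhost:1234/v1", "openai/x", "hello")
print(req.full_url)  # http://localhost:1234/v1/chat/completions
```

Sending it with `urllib.request.urlopen(req)` only works while a local server is actually listening on that port.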
Zhipu (GLM-4)
# This is all you need to get started
from interpreter import interpreter

interpreter.offline = True
interpreter.llm.model = "openai/glm-4"  # Tells OI to use an OpenAI-compatible server
interpreter.llm.api_key = "XXXXXX"
interpreter.llm.api_base = "https://open.bigmodel.cn/api/paas/v4/"
interpreter.llm.context_window = 7000
interpreter.llm.max_tokens = 1000
interpreter.llm.supports_functions = False
interpreter.chat()
prerequisite:
enter directory:
/home/song/workspace/open-interpreter/examples
task:
please list all files in the current directory.
copy the files whose names contain the word "local" from the current working directory to the target directory "/home/song/workspace/copiedlocal".
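For reference, the copy step of this task is equivalent to the plain-Python sketch below, so you can check what the agent generates against it; the function name is made up here, and the keyword argument defaults to "local" as in the task:

```python
import shutil
from pathlib import Path

def copy_matching(src_dir: Path, dst_dir: Path, keyword: str = "local") -> list[str]:
    """Copy every regular file whose name contains `keyword` from src_dir to dst_dir."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for path in sorted(src_dir.iterdir()):
        if path.is_file() and keyword in path.name:
            shutil.copy2(path, dst_dir / path.name)
            copied.append(path.name)
    return copied
```

With the directories from the task, `copy_matching(Path("/home/song/workspace/open-interpreter/examples"), Path("/home/song/workspace/copiedlocal"))` performs the same copy the agent is asked to do.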