https://github.com/microsoft/LMOps
# LMOps

LMOps is a research initiative on fundamental research and technology for building AI products w/ foundation models, especially the general technology for enabling AI capabilities w/ LLMs and Generative AI models.

* Better Prompts: Promptist, Extensible prompts
* Longer Context: Structured prompting, Length-Extrapolatable Transformers
* Knowledge Augmentation (TBA)
* Fundamentals

## Links

* [microsoft/unilm](https://github.com/microsoft/unilm): Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
* [microsoft/torchscale](https://github.com/microsoft/torchscale): Transformers at (any) Scale

## News

* [Paper Release] Dec 2022: Why Can GPT Learn In-Context? Language Models Secretly Perform Finetuning as Meta Optimizers
* [Paper & Model & Demo Release] Dec 2022: Optimizing Prompts for Text-to-Image Generation
* [Paper & Code Release] Dec 2022: Structured Prompting: Scaling In-Context Learning to 1,000 Examples
* [Paper Release] Nov 2022: Extensible Prompts for Language Models

## Prompt Intelligence

Advanced technologies that facilitate prompting language models.

### Promptist: reinforcement learning for automatic prompt optimization

[Paper] Optimizing Prompts for Text-to-Image Generation

* A language model serves as a prompt interface that optimizes user input into model-preferred prompts.
* The prompt interface is a language model learned for automatic prompt optimization via reinforcement learning (a usage sketch appears at the end of this overview).

### Structured Prompting: consume long-sequence prompts in an efficient way

[Paper] Structured Prompting: Scaling In-Context Learning to 1,000 Examples

Example use cases:

1. Prepend (many) retrieved (long) documents as context in GPT.
2. Scale in-context learning to many demonstration examples.

### X-Prompt: extensible prompts beyond NL for descriptive instructions

[Paper] Extensible Prompts for Language Models

* An extensible interface that allows prompting LLMs beyond natural language for fine-grained specifications.
* Context-guided imaginary word learning for general usability.

## Fundamental Understanding of LLMs

### Understanding In-Context Learning

[Paper] Why Can GPT Learn In-Context? Language Models Secretly Perform Finetuning as Meta Optimizers

* Conditioned on the demonstration examples, GPT produces meta-gradients for in-context learning (ICL) through forward computation. ICL works by applying these meta-gradients to the model through attention.
* The meta-optimization process of ICL shares a dual view with finetuning, which explicitly updates the model parameters with back-propagated gradients.
* Optimization algorithms (such as SGD with momentum) can be translated into their corresponding Transformer architectures.
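The dual view above can be illustrated numerically. For relaxed (linear, softmax-free) attention, as used in the paper's derivation, attending to the demonstration tokens is exactly equivalent to applying an implicit weight update, a sum of outer products, to the test query. The sketch below only demonstrates that identity; all shapes, names, and random inputs are illustrative assumptions, not the paper's code.

```python
# Minimal sketch of the ICL "dual view" with linear (softmax-free) attention.
# All sizes and tensors are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d = 8           # head dimension
n_demo = 5      # number of demonstration tokens

W_V = rng.normal(size=(d, d))            # value projection
W_K = rng.normal(size=(d, d))            # key projection
X_demo = rng.normal(size=(d, n_demo))    # demonstration token representations
q = rng.normal(size=(d,))                # attention query for the test token

# (1) Linear attention over the demonstration tokens: values @ keys^T @ query
values = W_V @ X_demo
keys = W_K @ X_demo
attn_out = values @ (keys.T @ q)

# (2) The same computation viewed as a weight update: delta_W is a sum of
#     outer products v_i k_i^T, i.e. a "meta-gradient" produced in the
#     forward pass and applied to the query.
delta_W = sum(np.outer(values[:, i], keys[:, i]) for i in range(n_demo))
update_out = delta_W @ q

# The two views coincide: attending to demonstrations is equivalent to
# applying an implicit parameter update.
assert np.allclose(attn_out, update_out)
print("max abs difference:", np.max(np.abs(attn_out - update_out)))
```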
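As a companion to the Promptist section above, here is a hedged sketch of using a prompt-optimizing language model as an off-the-shelf prompt rewriter with Hugging Face `transformers`. The checkpoint id (`microsoft/Promptist`), the GPT-2 tokenizer, the `Rephrase:` input suffix, and the decoding settings are assumptions based on the public demo; the authoritative code lives in the `promptist/` folder of this repository.

```python
# Hedged usage sketch: rewrite a plain user prompt into a model-preferred
# prompt for text-to-image generation. Checkpoint name and input format
# are assumptions; see the promptist/ folder for the released code.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("microsoft/Promptist")

user_prompt = "a cat sitting on a windowsill at sunset"
input_ids = tokenizer(user_prompt.strip() + " Rephrase:", return_tensors="pt").input_ids

# Generate the optimized prompt with beam search.
outputs = model.generate(
    input_ids,
    do_sample=False,
    num_beams=8,
    max_new_tokens=75,
    eos_token_id=tokenizer.eos_token_id,
    pad_token_id=tokenizer.eos_token_id,
)

# Keep only the newly generated tokens (the optimized prompt).
optimized_prompt = tokenizer.decode(
    outputs[0][input_ids.shape[-1]:], skip_special_tokens=True
).strip()
print(optimized_prompt)
```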
## Hiring: aka.ms/nlpagi

We are hiring at all levels (including FTE researchers and interns)! If you are interested in working with us on Foundation Models (aka large-scale pre-trained models), AGI, NLP, MT, Speech, Document AI, and Multimodal AI, please send your resume to fuwei@microsoft.com.

## License

This project is licensed under the license found in the LICENSE file in the root directory of this source tree.

Microsoft Open Source Code of Conduct

## Contact Information

For help or issues using the pre-trained models, please submit a GitHub issue. For other communications, please contact Furu Wei (fuwei@microsoft.com).