At its Ignite 2023 conference this week, Microsoft demonstrated Copilot technologies that make managing Azure cloud services easier, along with a tool that makes developing and deploying artificial intelligence (AI) apps on the Azure platform more efficient.
Microsoft also introduced Microsoft Copilot Studio, a low-code application that streamlines the development of bespoke copilots and data-integration plugins for the previously announced Microsoft Copilot for Microsoft 365 tool.
Microsoft Copilot for Azure uses large language models (LLMs) to let IT teams create, configure, find, and debug Azure services using natural language. IT teams can also use it to formulate sophisticated commands, ask questions, and reduce costs.
Microsoft’s corporate vice president for Azure Core, Erin Chapple, told Ignite attendees that Microsoft Copilot for Azure is already being used by Microsoft and a handful of its customers to manage Azure infrastructure.
Microsoft’s long-term strategy is clearly to have organisations use Azure AI Studio, a framework for invoking the AI models Microsoft makes available on the Azure cloud, to build and deploy AI apps more quickly. Ultimately, the aim is to enable enterprises to build their own copilots on top of trained AI models.
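Invoking an Azure-hosted model typically means calling a deployment endpoint. The sketch below shows the general shape of such a call against the Azure OpenAI chat-completions REST API, using only the Python standard library. The resource name, deployment name, and API key are hypothetical placeholders, and the exact endpoint shape and API version are assumptions — consult the Azure OpenAI documentation for your resource.

```python
import json
import urllib.request

# Hypothetical values -- substitute your own Azure resource and deployment.
ENDPOINT = "https://my-resource.openai.azure.com"
DEPLOYMENT = "my-gpt-deployment"
API_VERSION = "2023-05-15"

def build_chat_request(prompt: str, api_key: str = "<key>") -> urllib.request.Request:
    """Assemble a chat-completions request against an Azure OpenAI deployment."""
    url = (f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
           f"/chat/completions?api-version={API_VERSION}")
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
    )

def ask(prompt: str, api_key: str) -> str:
    """Send the prompt and return the model's reply (requires valid credentials)."""
    with urllib.request.urlopen(build_chat_request(prompt, api_key)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

In practice, most teams would use an SDK rather than raw HTTP, but the point is the same: the deployed model is just an endpoint a custom copilot can call.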
While building apps using AI models is still in its infancy, it is already clear that DevOps, machine learning operations (MLOps), and best practices for cybersecurity and data engineering must converge. Microsoft argues that Azure AI Studio is the architecture that will allow IT organisations to accomplish that.
Though other IT infrastructure resource providers share the same goals, Microsoft is the most advanced in providing a framework for creating large-scale AI applications, thanks to its investments in OpenAI and its acquisition of GitHub. Last week, GitHub unveiled a preview of Copilot Workspace, an extension of its existing Copilot tools that uses generative AI to automatically propose an editable plan for building an application, based on natural language descriptions typed into the GitHub Issues project management software. With a single click, Copilot Workspace can produce editable documents that developers can use to write code they later review. The Copilot Workspace platform, or the application developers themselves, can also automatically repair any faults they find.
In the meantime, GitHub has expanded Copilot Chat’s functionality to help developers find problems in their code base more easily by using natural language.
Applications are being produced at a much faster pace thanks to generative AI, but the code still needs to be vetted. ChatGPT is built on a general-purpose LLM trained by ingesting code of varying quality from across the internet. Consequently, the code the platform produces may be inefficient or vulnerable. In many circumstances, professional developers still choose to write their own code.
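To make the vetting point concrete, the illustrative snippet below contrasts a common pattern an LLM trained on mixed-quality code may suggest — interpolating user input directly into a SQL string — with the vetted, parameterised alternative. The table and payload are invented for illustration.

```python
import sqlite3

def find_user_unsafe(conn, name):
    # The kind of code a general-purpose LLM may suggest: the user-supplied
    # value is interpolated into the SQL string, enabling SQL injection.
    return conn.execute(f"SELECT id FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn, name):
    # The vetted version: a parameterised query treats the input purely as data.
    return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"                    # classic injection payload
leaked = find_user_unsafe(conn, payload)    # matches every row in the table
safe = find_user_safe(conn, payload)        # matches nothing, as intended
```

Both functions look equally plausible at a glance, which is exactly why generated code needs human review.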
Naturally, not all programming tasks call for the same degree of coding proficiency. ChatGPT will frequently produce scripts, for example, that can be confidently reused throughout a DevOps workflow. Thanks to technologies like GitHub Copilot, plenty of average developers are suddenly producing better code, and domain-specific LLMs trained on verified code samples will soon make it possible to write better code consistently.
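A small, self-contained utility is the kind of script a tool like ChatGPT can reliably generate for a DevOps workflow. The hypothetical example below bumps a semantic version string, the sort of helper often wired into a release pipeline:

```python
import re

def bump_version(version: str, part: str = "patch") -> str:
    """Bump a semantic version string (e.g. "1.4.9") by the given part."""
    match = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)", version)
    if not match:
        raise ValueError(f"not a semantic version: {version!r}")
    major, minor, patch = map(int, match.groups())
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"   # default: bump the patch number
```

Scripts at this level of complexity are easy to review and test, which is why they can be reused with confidence even when machine-generated.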
The next task will be figuring out how to handle larger code volumes. AI will undoubtedly be used to manage DevOps pipelines as well, but for the time being at least, AI is being used to write code faster than DevOps teams can handle it.