Artificial intelligence is the new electricity. As electricity revolutionized the world and transformed nearly every industry, so AI is poised to radically reshape the global economy and our place in it.
However, for the long tail of global enterprises to leverage this emerging technology, practitioners must standardize and share best practices for building, maintaining, and improving AI systems over time. Like Agile for engineering, MLOps (or “machine learning operations”) supports AI systems through shared development practices that scale within and across organizations.
We recently teamed up with experts at AWS, Sequoia Capital, and Madrona Venture Group to write this whitepaper on MLOps and how it relates to startups and building new companies.
Inside we talk about the state of MLOps, including:
- What are the paths to MLOps success?
- What is MLOps?
- What is the data-centric approach to MLOps?
- Will MLOps converge to a modular architecture?
- Why does MLOps need open-source software development?
- How does AWS support open-source machine learning?
- Why does MLOps need vendors as well as open-source software?
- How are some of the leading investors in the AI space thinking about MLOps?
Also included are perspectives on the AI Fund portfolio companies Landing AI, WhyLabs, and ValidMind.