
Jefferies Software Conference


Transcript


Who: Jessica Hawk, CVP, Data, AI and Digital Apps
Event: Jefferies Software Conference
Date: May 29, 2024

Joe Gallo, Analyst, Jefferies:
All right. We’re going to get started. I’m Joe Gallo. And today, we’re delighted to host Jessica Hawk, Corporate Vice President of Data, AI and Digital Apps. She’s been at Microsoft for three years, I believe. She had previously co-founded Capax Global, a national system integrator and one of the few super partners with Microsoft. But her role in all things data and AI makes her probably the most important or interesting person at this conference as well as at Microsoft. So thank you very much. And also thank you to Kendra and James from Investor Relations for joining us as well.

Maybe to kick things off, you’ve been at Microsoft three years, which I think is interesting because most people at Microsoft have been there decades. And so you have a fresh perspective and you also co-founded Capax. Can you just give us a sense of where most of your time is being spent and what your biggest focus areas are now?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

Sure. Good morning, everybody. So product marketing is done differently at every organization around the world, for sure. At Microsoft, it is very much a product strategy role. We spend a ton of time assessing market conditions, working with our engineering teams to find product-market fit, and doing a lot with sales enablement. It’s less of what people often think of when they hear marketing, the traditional campaign development, asset creation and imagery; we have a sister org that does that. My role is to run the product strategy for most of Azure. And Azure is organized in three distinct areas: infra; data and AI; and digital applications and innovation. I have the latter two, and I partner with others in the company on infra, which is where you would think about things like Windows VMs spinning in Azure.

Joe Gallo, Analyst, Jefferies:

Awesome. Maybe before we dig into data and AI specifically, can you just remind investors how we should think about Microsoft and their AI strategy as a whole?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

Well, the timing for this event was spectacular because if you’re watching us on LinkedIn at all, we had our major conference last week in Seattle called Microsoft Build. And I think Satya did a really nice job of laying out the three ways to think about Microsoft’s plans with regards to our AI strategy. So we talked about the Copilot+ PCs, Copilot and the Copilot stack. And so I focus on the Copilot stack. So it’s everything from the AI infrastructure that is making all of this possible to the data and AI tools that our customers are taking advantage of. So think less Microsoft Copilot, the Copilot experience you see in Microsoft Word, for example. Think more what customers are doing with the AI infrastructure and their data and their apps layer and the developer tools that we make available to the market, things like GitHub, Visual Studio, VS Code. That’s the Copilot stack.

So it’s truly, where customers are taking advantage of the same platform that is serving the Microsoft Copilot. They can go build their own Copilot experiences. And so that’s really, where we spent most of our time at the event. And when you think about the strategy, it’s very much – we are in a fantastic position because we are able to orchestrate across all three of those circles, if you will, and create these connected experiences that can run from a PC experience or a mobile experience to what’s happening in the apps that we deliver around the world in our productivity suite to what our customers can then go either extend from the productivity suite or build new for their own customer employee experiences.

Joe Gallo, Analyst, Jefferies:

That’s a good segue, your Build was last week. So for us, investors, what were the most exciting announcements and what should we be focused on?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

We had a lot, over 50, and I’m telling you, even that was a short list of all of the things that we shipped across the business. But let’s go with 50. My role is to work with Satya and his team on his keynote and then Scott Guthrie and his team on his keynote. And so Satya’s keynote is a great way to go look at what we would say are the top items. I didn’t count, but I would estimate maybe about 50% or more of them were related to the Copilot stack.

And then on day two, Scott took the world through the entire Copilot stack. And so there are a couple of things that I would say sort of broke through. First of all, the concept of the Copilot stack itself, I would say, is one. We’ve been describing this concept since last year’s Build. And at the time, going back to last year, what we were really trying to do was help software developers, because that is our Build conference audience; it’s software developers primarily. Just think through: okay, you already know how to build applications, you already know how to build software.

What is different or new about building applications that have Gen AI built into them? If you’re at all technical, you know that software developers work off of a solution architecture diagram, and the Copilot stack was meant to at least help them understand: what are the boxes that I need to now go think about when I’m building an app that’s going to have Gen AI in it? And then what we did on day two was Scott took the audience and our online audience through: okay, you understand the conceptual framework, you understand the three circles of the overall AI strategy.

Now let’s talk about the products that we’re making available within each layer of the Copilot stack. So customers can move as quickly as they want. Everybody is on an AI transformation mission right now. And so just understanding crisply and clearly, what is Microsoft offering me, what are the tools and products and services I can use to go build these apps, that’s what we did on the Copilot stack.

And so some of the top things that pop through beyond just the concept of what the stack is and how companies can think about using it, the Azure AI Studio went GA. And think of that as the single destination for developers and customers and partners to manage the Gen AI specific part of the app layer. It’s not meant to replace GitHub Copilot or Visual Studio or any of the other developer spaces; those are all still very real and critical. The Azure AI Studio allows our customers to take a foundational model and tune it for their own internal use and apply safety controls to it.

And so it’s a very specific to Gen AI set of work that developers have to do, and we’re excited to bring it to GA to give not only developers a reliable place to go do that work but also find a plane to inspect the application of content safety, our Azure AI content safety service, which is where the responsible AI tooling shows up. Not only do developers need that but their decision makers and the leaders within their organizations need that. And we understand every customer is thinking about how to do this safely and responsibly. And it gives them that single place for them to go and inspect those choices that are being made to make sure that the apps are being built responsibly. So that’s one.

We also announced GitHub Copilot for Azure. Now GitHub runs across all the major software cloud vendors, right? What we did with GitHub Copilot for Azure is we took the GitHub Copilot experience, which has been very successful in market. It is one of the world’s oldest at-scale Gen AI applications, period. I don’t know that people remember that very often, but it’s been out there for a while. And we took the learnings that the GitHub development team had from building the Copilot experience, and we made it smarter about Azure specifically. Because as you’ve seen in all of the earnings reports, we are bringing a lot of new-to-Azure customers as a result of our Azure AI offerings, and AWS was first. And therefore, there’s just a little bit more understanding of how AWS cloud services work than Azure.

And so we needed to do something to help all these new-to-Azure developers and customers go more quickly. So we did fine-tuning on our own Azure AI models inside of GitHub Copilot, which means that a developer can now just say @azure, I want to do blah, and it will respond to them in the way that you would build that experience, that development task, in an Azure service. So it’s really fast-pathing their adoption of the Azure way of doing things. And then there are so many more, but maybe one last one: we announced two things around Microsoft Fabric. And we can talk about this in a second.

But for sure, at their base level, the foundational models are fantastic because they understand all the world’s languages and they understand the corpus of the world’s knowledge, because they’ve been trained on it. But they know absolutely nothing about a customer’s individual business, right? And so that’s where RAG and fine-tuning the models on a customer’s data is where they get to go do things like, as we like to say (I am a marketer), bending the curve of innovation. That’s where these rich new experiences are being built by our customers, because the very, very smart models are now very, very smart about their business data.
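The RAG pattern she describes can be sketched very simply: retrieve relevant snippets from the customer's own documents, then ground the model's prompt in them. In the sketch below, retrieval is reduced to naive keyword overlap and the model call is a stub; the function names are illustrative, not any real Azure API.

```python
# Minimal sketch of retrieval-augmented generation (RAG): a general
# model knows nothing about a business until its prompt is grounded
# in that business's own documents.

def retrieve(query, documents, top_k=2):
    """Rank documents by naive keyword overlap with the query."""
    query_terms = set(query.lower().split())
    scored = [(len(query_terms & set(doc.lower().split())), doc)
              for doc in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def answer_with_rag(query, documents, call_model):
    """Build a prompt grounded in retrieved context, then call the model."""
    context = "\n".join(retrieve(query, documents))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_model(prompt)

if __name__ == "__main__":
    docs = [
        "Orders over $50 ship free within the continental US.",
        "Returns are accepted within 30 days of delivery.",
        "Our support line is open 9am-5pm Pacific.",
    ]
    # Stub model that echoes its grounded prompt; a real app would call
    # a hosted LLM endpoint here instead.
    print(answer_with_rag("When are returns accepted?", docs, lambda p: p))
```

In practice the retrieval step is a vector search over embeddings and the stub is a call to a hosted model, but the shape of the pattern (retrieve, ground, generate) is the same.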

And so Microsoft Fabric has been a big hit in the market since we GA-ed it at Ignite, which was November of last year. And we expanded its remit at Build this year. We added a real-time intelligence capability to it. So we are going after that vast stream of at-the-edge IoT data, data in motion. There has not really been a clear dominant winner in that space. And so we are very excited about the capabilities we brought to Microsoft Fabric to go after that set of data, to help our customers do the real-time intelligence they need to do on such a large set of data.

And then we announced the Workload Development Kit, not a very exciting name, no doubt. I could probably have done better on that one. But the key takeaway, and this is one of the things that got breakthrough applause in Satya’s keynote when he was unveiling it, is we recognize that our customers’ data lives in lots of different places. And the Fabric vision all along has been to make it easy for them to go apply reporting, analytics and AI to data wherever it lives, rather than put them through a very, very long journey to have to migrate. Of course, ultimately, we would hope that they will, but we’re giving them a faster path to get started.

And so what the Workload Development Kit does is it allows our partners, of which we have many, Microsoft has over 400,000 partners around the world, and many of them are ISVs in the data and AI space who create data-specific solutions. Think SAS or Prophecy or Informatica, large names in the data space. They can now work with the Workload Development Kit to create what will look like a native Fabric workload experience with their technology. And what that means is customers are excited about the investments they’ve already made, and we want them to feel good about that. We want them to feel like they can continue to get ROI. Maybe they’ve invested deeply in Informatica, for example; they can now bring their Informatica data and all the experience that Informatica offers into their Fabric environment.

Joe Gallo, Analyst, Jefferies:

Awesome. That’s a perfect summary of several days in a few minutes. So I felt like I was there. I want to follow up, I think it was the second point you made. But how are customers thinking about their AI road maps and timing? And we’ve heard anecdotally of customers selecting Azure because of your AI lead. Are you seeing that, is that impacting customers’ cloud road maps as well?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

I think, so for sure, every customer conversation we’re having today includes AI. And we actually put together a pretty comprehensive go-to-market strategy early last year to help customers think through all of the different options, because we’ve heard it loud and clear: there are a lot of copilots running around out there, and this Azure AI thing seems to be hunting, what’s that? And so we were trying to just help people understand what the opportunity is. And it’s evolved for sure since we first launched it in, I think, January or February or March of last year. But it does help customers understand where those 1P SaaS app purchase options are, so Copilot with a capital C.

And we see a lot of focus, of course, on employee productivity. You see that with all the Microsoft Copilot customer evidence and go-to-market that we’re delivering. We see a lot of focus on call center innovation. And that can either be through our SaaS service, which is the Copilot for Service that’s delivered through our BizApps platform or customers are attaching the Azure AI experience to ISVs that they have brought in to deliver that customer call center experience.

And then moving past that, business process automation has gotten cool again. Suddenly, everybody is talking about how can I get my Visio going of my entire end-to-end flow and then where can I apply AI to sort of accelerate that or reduce friction. I think everybody’s felt this deeply. There’s never been enough people to do all of the work across the business process chain.

And so finding ways to provide a copilot experience, or to augment the humans who already have too much to do so they can get their work done more quickly, we’re certainly seeing a lot of that. And then the most recent one, and this is where I say bending the curve of innovation, is what customers are building, particularly since we started to bring the multimodal experiences to Azure AI. We GA-ed the Azure OpenAI Service January 16th of last year, and so it was first to market in many ways. GitHub Copilot came first, and then the Azure AI API service that customers can go build on came next. Then we put Bing Copilot out there, and all the rest of the Copilots came along afterwards.

And I would say initially, a lot of people use it for enterprise search, which has never been great. It's never been easy to go find – I need to – I got to find a document. I can't remember was it in a call or was it an e-mail or was it texted to me? So for sure, there was a lot of initial adoption of just building a better enterprise search service, but that was really just kind of the beginning.

And now with multimodal in particular, we're seeing all kinds of really exciting use cases that get into the other media formats. And I should start by saying multimodal is an industry term, one I'm going to try to change if I can, because it's a very confusing term, because everybody also understands that applications will use lots of different models.

And I think when people hear multimodal, they think, oh, different models. Multimodal actually means multimedia. So it's getting beyond text and getting into images and speech and video, these other modalities. That's what it's meant to convey. That's one of the reasons why I think OpenAI went with GPT-4o, to try and convey omni, like it's many, many different types of modalities.

But the use cases are coming out of what happens when you can do text-to-speech, or when you can do image processing, or when you can create videos from scratch. We all know it's much easier to edit something than to start from a blank page. And so all of these new experiences are starting to come forward, and I think particularly now that we've started to offer some of these multimodal capabilities, we're going to see more of that.

Joe Gallo, Analyst, Jefferies:

Is there anything else further that's a gating factor to enterprises adopting AI? Is there – are people waiting for version 2.0? Are they waiting for more under-pricing? Like what can you guys do to kind of drive an acceleration of adoption?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

Yeah. I think probably three things. Internal organizational readiness, for sure, is the number one thing that I think customers are focusing on. There's a little bit of natural trepidation in the system around what this is going to do to my role or to my job, and are we going to be able to do this in a safe and secure way? So I think people are starting to really internalize that this is a change management opportunity within the organization unlike anything before. You might say cloud computing was similar. I don't think so. Cloud computing was largely confined to one part of the org in terms of getting into that mental shift of releasing the on-prem control.

With AI, it's really across the org. And so what I saw last year was a deep desire to learn and understand how we're thinking about responsible AI and what tooling we're creating for our customers to go do that safely. I think this year, we're seeing more understanding that it's a broad company culture change, and you need an AI champ in every organization. I think the fact that agile software methodology came before Gen AI has been extremely helpful, because it's empowered customers to think: I just want to quickly do a POC, learn, adjust and move on.

Those kinds of approaches, bringing the business and the tech team more closely together, this has been kind of a divide in the market for forever. I do think that the Gen AI projects that I've seen our customers do, like there's always this deeply connected business and tech function working together. So that's been really good. So I would say we're moving into less of a broad concern and more into, okay, I'm starting to understand how to think about doing this, and we're actioning it. So that's number one.

The second is the data estate. In fact, at the CEO Summit that we did last year, so this would have been May, still pretty early in the Gen AI moment but past that initial is-this-just-another-hype-cycle moment, I was doing a roundtable session with Eric Boyd, who runs the Azure AI platform, my engineering counterpart. We had gone through our slides, and it was just kind of an informal room about this size.

And one of the CEOs of a very large global firm put his hand up and he says, does this mean I can cancel my data estate modernization project that I feel like has been going on for forever because the AI can just figure it all out? And the answer is no. In fact, if anything, you should feel really, really good about the investments you made and keep going and go faster. And we see that with customer after customer after customer. There's just massive advantage in terms of latency. These apps cannot be slow. If you have to wait a long time for your response, you've lost your users.

And so being co-located in the cloud, and I do think this is some of where that interest in Azure all up is coming from because technology teams understand this, being co-located in Azure, where the Azure AI Services are running, there's an absolute performance benefit. There are security benefits, there's all kinds of efficiencies that come from that. And so for those who are already there, like some of our earliest customers that adopted the Azure AI Service, CarMax is a great example. They had already done all the data estate modernization.

They turned the Azure OpenAI Service on, I think it was like February 5th or so, just weeks after we had made it available, because at the end of the day, it's just an API call. They already had an app. They already had their data organized. They were already running in Azure, and they were able to flip to the Azure OpenAI Service. The time that it would have taken their editorial team to review all the customer feedback reviews about the cars that they're putting onto their website, they estimated it to be, I think it was like 11,000 days or hours, I can't remember the stat anymore, and they were able to do it in just a few hours, because that's the pace at which you can move. But if the data is not well organized, if it's not modern, if it's not in the cloud, there are going to be some challenges there as well.
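The "it's just an API call" point can be sketched as a thin wrapper: if the app and the data are already in place, batching content through a hosted model is a few lines of glue. In the sketch below, `call_model` stands in for the hosted endpoint; no real Azure OpenAI client or credentials are involved, and the function names are illustrative.

```python
# Sketch of the "it's just an API call" idea: with data already
# organized, review summarization is a loop that batches reviews
# into prompts and sends each batch to a model endpoint (stubbed here).

def summarize_reviews(reviews, call_model, batch_size=10):
    """Summarize customer reviews in batches, one model call per batch."""
    summaries = []
    for i in range(0, len(reviews), batch_size):
        batch = reviews[i:i + batch_size]
        prompt = "Summarize these customer reviews:\n" + "\n".join(batch)
        summaries.append(call_model(prompt))
    return summaries

if __name__ == "__main__":
    reviews = ["Great car, smooth ride.", "Fast delivery.", "Clean interior."]
    # Stub model: a real deployment would POST the prompt to an LLM endpoint.
    print(summarize_reviews(reviews, lambda p: "summary", batch_size=2))
    # → ['summary', 'summary']
```

The hard part, as she notes, is not this loop; it is having the reviews organized and co-located with the model in the first place.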

Joe Gallo, Analyst, Jefferies:

Maybe just following up on data. We keep hearing more and more about Fabric. How are customers thinking about that? And what's unique about that platform?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

Yes. It is fun to talk about Fabric in this room. Kendra's like, the investors aren't always necessarily interested in data products, but Fabric has broken through. So thank you for your interest.

I think it kind of goes back to what I said before, which was, number one, there's nothing new about the need to wrangle your data. These systems were built over the last several decades by lots of different teams in lots of different places. Every customer is dealing with some version of on-prem, in the cloud, different point players. We have some crazy slide, in like 6-point font, of all the data providers in the world. There are so many tools out there, and so the challenge has always been there.

And so with Fabric, I think one of the things that's truly unique about it is we said: we know your analytics needs are mission-critical to the organization. Rather than put you on the two-year modernization journey to go get the outcomes you're looking for, let's give you a way to easily connect to the data wherever it lives, which includes other clouds and certainly other partners' environments, and give you a simple experience.

It's called mirroring, where you can just create a copy, without moving the data, and run your reporting right on top of that. And so there's the simplification, the time to production on these Fabric projects. We do win wires at Microsoft: when the account team closes a deal, they send a mail, and everyone's really excited.

With Fabric, we added live wires, which is when they went live. It's a SaaS experience, so it's pretty easy to get started. And it's the same experience for the entire data team, because there are 7 different workloads in Fabric; these are different teams in an organization, all working in the same place. And it's easy for them to go grab their data wherever it lives.

And so we're proud of the 11,000 paying customers just a few months after we GA-ed the service, and the pace at which they've been able to adopt the service, I think, has been pretty spectacular.

Joe Gallo, Analyst, Jefferies:

How do you think about the evolution of large language models? Will open source continue to be relevant? And then are we just going to have a handful of models that rule them all? Or are there going to be more than that?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

We are very, very intentional when we answer this question, when we talk about this with customers. Because at the end of the day, it comes back to Microsoft's core mission, and I don't know that every company lives their mission statement as deeply as we live ours. Our perspective is always to empower our customers and partners. And so we recognize a sense of choice and selection and control is super critical to anybody making a decision about their tech platform or any individual service within it. And so we have nearly 1,700 models in the Azure AI model catalog. That's what lives within that Azure AI Studio I mentioned earlier.

And it is a huge collection of open-source models. We have a great partnership with Hugging Face that we just extended at Build last week, which Satya announced, and which gives us some of that at-scale ability to bring in new models. We partnered with a company named HiddenLayer to go apply safety scanning on those open-source models, which is really interesting. When you think about open source in and of itself, customers, especially in our enterprise accounts, are very concerned about making sure that any tech they bring in is safe and secure, and so HiddenLayer is now scanning every single one of those nearly 1,700 models in the catalog.

And then in addition to our amazing partnership with OpenAI, we are also building our own family of models called the Phi family. This is developed by Microsoft Research in partnership with our Azure AI team and Kevin Scott's organization. And they are SLMs. So we've got LLMs, we've got SLMs; I've not seen MLM come up yet, but I'm certain it's around the corner. And so I do think it's right and it's expected that customers will want to continue to have that choice.

We just want to make it easy for them to pick the right model, and that's really where the conversation is going. I think we're past the idea that one model will rule them all. It's more about: what is my price point, what is my use case, what are my unique requirements, and which model is best for that. And the performance and evaluation tooling that we built into the Azure AI Studio makes it very easy for developers to go in there, do those kinds of assessments, and just pick the right model for the job.
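The "pick the right model for the job" idea reduces to a selection over a catalog scored on quality and cost. The catalog entries and numbers below are invented purely for illustration; real evaluation tooling works from actual eval runs, not hard-coded scores.

```python
# Toy sketch of model selection: given candidate models with a quality
# score and a per-token cost, pick the cheapest one that clears the
# use case's quality bar within the cost budget. All values are made up.

def pick_model(catalog, min_quality, max_cost):
    """Return the cheapest model meeting the quality bar, or None."""
    eligible = [
        m for m in catalog
        if m["quality"] >= min_quality and m["cost_per_1k_tokens"] <= max_cost
    ]
    if not eligible:
        return None
    return min(eligible, key=lambda m: m["cost_per_1k_tokens"])

if __name__ == "__main__":
    catalog = [
        {"name": "large-llm",  "quality": 0.95, "cost_per_1k_tokens": 0.03},
        {"name": "small-slm",  "quality": 0.82, "cost_per_1k_tokens": 0.002},
        {"name": "open-model", "quality": 0.88, "cost_per_1k_tokens": 0.005},
    ]
    choice = pick_model(catalog, min_quality=0.85, max_cost=0.01)
    print(choice["name"])  # → open-model
```

The point of the sketch is the shape of the decision, quality bar and budget per use case rather than one model for everything; a big LLM can lose to a smaller model once cost enters the comparison.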

Joe Gallo, Analyst, Jefferies:

Makes sense. We're out of time. But maybe, if we're back here in one year, which I hope we all are, what's one thing you're hoping that we're talking about?

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

That's the fastest 25 minutes of my life. I would say it's the multi, well, first of all, I hope we're calling it multimedia or something other than multimodal. So I'll get right on that. I think the world deeply understands the chat experience at this point. Getting into some of the more interesting multimedia experiences: we're having amazing conversations with CMOs right now, which has not necessarily been Microsoft's jam in the past, right, about what they can do from campaign development to marketing asset creation.

There's so much there that I think we're just scratching the surface of. And then some of the more connected analytics-to-Gen-AI applications. That, to me, is the other big frontier sitting in front of us, where a lot of what's been done today is based on unstructured documents, or unstructured data as we refer to it. Think product docs. That's where that call center enablement is happening. It's been a problem for 2 decades: the product team changes something, they try to send mails, they try to do trainings, they try all of this to inform the person on the edge, who takes that first call.

The first call is always really tough. Well, the models can read all the product documentation that developers produce by nature of their role, and so that first call can be augmented with the Copilot experience to make it a little bit easier as customers are transitioning to the new version of the service, which is when they call. So all of those things, I think, are getting really well understood. But connecting that to analytics outcomes, and what customers can then learn from what's happening in the call center and put back into their product innovation life cycle, I think that's another one that's going to be really exciting.

Joe Gallo, Analyst, Jefferies:

Awesome. Jessica, thank you so much for your time. Thanks, Kendra. Thanks, James.

Jessica Hawk, Corporate Vice President, Data, AI, and Digital Applications, Product Marketing:

Thanks, guys.
