CarahCast: Podcasts on Technology in the Public Sector

State of the Data Center Today: Dell & NVIDIA

Episode Summary

This podcast explores the rise of AI factories, data centers built to convert energy and data into AI "tokens." NVIDIA and Dell detail how federal agencies must rethink power, cooling, and compute, and how their AI Factory partnership helps agencies scale token generation, speed model training, and extend hardware performance for massive AI workloads.

Episode Transcription

Scott Robohn

Welcome to the Carahsoft Data Center Innovation Series, where we cover Carahsoft solutions with our partners in the booming data center market. We'll examine the tech, trends, and issues shaping the future of data centers in the public sector. I'm your host, Carahsoft Consulting CTO, Scott Robohn.

 

As we kick off this series, we have two very special guests from key Carahsoft partners in the data center ecosystem: Dan Carroll, Field CTO with Dell Federal, and Shane Shaneman, Senior AI Strategist with NVIDIA. Welcome to you both. Thank you.

 

Listeners, you're all aware of the position of these two companies and their impact on public sector data centers. We'll be touching on how these companies are actively shaping AI infrastructure, the mindset shift from just building data centers to building AI factories, token generation as a key metric that's new for a lot of us, and much more today.

 

Welcome, Dan and Shane. Thank you so much for being with us today. As we get started, could each of you give our listeners a brief intro, your history in the data center world, and your role?

 

Shane, let's start with you.

 

Shane Shaneman

Oh, absolutely. Scott, thanks so much for having me. My name is Shane Shaneman.

 

I'm the Senior AI Strategist for NVIDIA's Public Sector Group; I joined NVIDIA about 18 months ago. My focus is working across the public sector, the federal government, the Department of Defense, and the intelligence community, helping leaders embrace where AI is going and the AI-enabling infrastructure, the AI factories, they need to power it. Prior to joining NVIDIA, I spent seven years at Carnegie Mellon University helping to manage all the national security and defense research, and before that, a couple of different industry positions, most notably with Corning, looking at how we help build out fiber optics for data center markets as well.

 

Dan Carroll

Great. Welcome.

 

Shane Shaneman

Thank you.

 

Dan Carroll

My name is Dan Carroll, Field CTO for Dell Technologies Federal. I've been with Dell about seven years, and the CTO role is kind of interesting. I've been in government about 30 years.

 

I started out in the Marine Corps, so I have a lot of experience from that side of defense, being a consumer of IT and a consumer of these types of technologies. And I do a lot of work in prognostication, right? That's a key role of the CTO: trying to figure out where everything is going.

 

And it's kind of interesting, because we follow these trends that come up in IT all the time. AI is not a trend. I will say that up front: AI is a truly monumental shift in how the government is going to leverage its data to make better decisions.

 

Scott Robohn

Well, that couldn't be a better tee-up. Get your AI-powered crystal balls ready for the prognostication you'll deliver during this conversation. Now, this is the Carahsoft Data Center Innovation Series, but a really interesting part of our prep touched on a shift in mindset between data centers and AI factories.

 

And Shane, let's start with you. What do you mean by the shift toward building AI factories versus just building data centers?

 

Shane Shaneman

Yeah, so we've had data centers for decades, where we're basically storing data and processing applications. Sure. And the shift really is because of this evolution with generative AI, which is transforming what we can do with computers and enabling us to use human language to interface with artificial intelligence, rather than having to know Python or C++ or C# or whatever the case may be.

 

And so the shift really has come into the fold that rather than just storing data, we want to be able to have both energy and data as the inputs to the AI factory, and the output becomes intelligence in the form of tokens. And those tokens can be applied to any number of industries, whether it's transportation or healthcare or manufacturing, to basically drastically enhance and transform that industry and the businesses that are operating within it. And that's what we mean when we talk about the shift towards an AI factory, is that the output, it's producing intelligence in the form of tokens.

 

Scott Robohn

And that makes sense. It's a helpful framing concept. It doesn't mean we're done with data center infrastructure, correct?

 

Dan Carroll

Oh, no, no, no, no, no, no. Not at all. So from our perspective in working with NVIDIA and working with the federal government, what has changed is how you build your data centers, right?

 

Before, it was really focused on the storage elements and how much data you could hold in the data center. New AI data centers, or high-performance compute data centers, require a complete rethink. The amount of compute needed to do what Shane is talking about requires an incredible amount of power and cooling.

 

And most legacy data centers will either have to be retrofitted and basically re-evaluated for their current infrastructure to see if they can support it. And if they can't, can they get the right amount of power in there? Can they get the cooling in there?

 

And for new data centers being built out, those elements are the first thing you're thinking about. And just to touch on it, we'll probably go further down this road: that has been a big point of discussion in some of what came out around America's AI Action Plan, as well as the information coming out of the Department of Energy on shared land for AI data centers.

 

Scott Robohn

Some big announcements over the last couple weeks. I think it makes sense. As of the timing of this recording, we're in early August 2025, and we're just within a couple weeks of the latest round of DOE land announcements and the administration's AI action plan.

 

What's striking you both about this? We're going to get back to the token issue and what's a token, but what's hitting you most in the last couple weeks after you saw the plan get published?

 

Shane Shaneman

I would start off with just the sense of urgency that is coming out, the urgency to win the AI race, and the understanding of the critical role infrastructure plays in enabling the rest of the functionality the government is looking to harness and take advantage of. And that's really what it comes back down to: how they build out the thrusts to establish that foundational infrastructure that then powers the rest of the AI stack.

 

Dan Carroll

Absolutely. And that's what's interesting, too, is that there have been IT initiatives in the past. Five years ago, Zero Trust.

 

Zero Trust was at the forefront of defending national infrastructure, and it still is critically important. But you did not see the unification of both the commercial and federal sectors around driving towards an IT goal like you do with AI. The reason is that, in order for the United States to maintain its competitive advantage on the world stage, AI is critical.

 

And everything that has been called out in that action plan is a hard necessity for us to meet those goals.

 

Scott Robohn

And as we were chatting about catching up on the content and analysis from it, one thing I've seen across the board is that there's no argument on the why. This is a clear national priority. There might be some discussion on the hows.

 

So I'll throw that to you two. How are you collaborating, NVIDIA and Dell, to shape that how, especially in the light of the structure? Do you have initiatives that you want to call out or things that you're doing specifically?

 

Dan Carroll

Sure. I'll start with this one. The Dell NVIDIA AI Factory is probably one of the biggest initiatives I've seen started between two corporations: a unification between the two companies around what, in the federal space, we're calling a framework to help organizations understand not just how to consume AI, but how to apply it to their use cases.

 

Too often, people will go, oh, that looks delicious. And then they'll eat the meal, but they really don't know what went into preparing it. And for AI, it's much more critical for organizations to understand how to prepare their meal in order to take an effective approach to consuming it.

 

Scott Robohn

Kind of matters, right?

 

Dan Carroll

It does. So the AI factory between Dell and NVIDIA, what that does is it allows an organization to bring in the use case. We're able to work with them to understand what the requirements are.

 

And then we figure out the appropriate services, hardware, and software to tie around that, to do incubation and testing and make sure that it works effectively before you field the very expensive solution later, right? So all the way around, it's a good approach for cost savings and application.

 

Scott Robohn

And Shane, in your commentary here, so many people go straight to focusing on GPUs with NVIDIA and don't necessarily understand the software ecosystem that you also have in place. How does that play into the AI factory cooperation that you're working on?

 

Shane Shaneman

Absolutely. And the relationship that we have in the partnership with Dell, if we think about how we're going to scale these solutions, the only way that we can scale is through the incredible partnerships like we have with Dell. And really, it comes back down to capitalizing on the chips, the systems, and the software that NVIDIA builds as part of our AI platform.

 

And then working with partners like Dell to operationalize it in the most cost-effective way that we possibly can. And that's really what it gets back down to. And Dan highlighted that, hey, we're leveraging this to solve problems.

 

And I always like to go back to, that's really why we develop an AI. And that's why our customers are looking to leverage an AI platform. It's to solve a problem and to empower their workforce.

 

And we have to be able to do that at the speed and scale of mission.

 

Dan Carroll

And I will add, the one thing that's always interesting with NVIDIA is, like you said, a lot of people instantly jump to the GPUs and all that stuff. The software development going on at NVIDIA and the fact that newer generations of software are making older hardware more valuable. Are you kidding me?

 

Like that does not happen with most vendors. They don't improve, like double or triple the performance of a hardware platform based on just the software they're developing.

 

Shane Shaneman

And if you look at the software libraries, the key point is to really capitalize on the acceleration. All too often, we'll have people come in and talk about improving a process by 20% or 40%. Well, when we talk about codifying a workflow, we're not talking about a percentage increase.

 

We're talking about the power of X. And our typical acceleration is anywhere between 50 and 75 times what was possible with CPU-based compute infrastructure. So when we start looking at these workflows, we're able to compress preparing our data for algorithmic processing from two to three weeks down to two to three days.

 

And that's the partnership. And quite honestly, being able to operationalize that for customers is really the power of the Dell NVIDIA partnership.

 

Scott Robohn

Yeah, I'm seeing that through things that are flowing through Carahsoft as well. So I can't speak to it as eloquently as either of you can, but I can attest to it for sure. One of the really interesting things here, as you bring this up, that acceleration through software really helps customers consume and keep GPUs in service for much longer.

 

You have this rapid innovation in hardware, which we want and expect, but to extend the life of the hardware piece of the ecosystem through software improvements, that's something everybody can benefit from. 100%. Now, I want to go back to some of the opening statements on tokens.

 

There's a lot of us who this is a new concept for. We're used to number of cores or flops or the amount of RAM or MIPS as the traditional compute-related metrics that we think about. Tell us about tokens.

 

What is a token? We can go through two levels of explanation: maybe something really consumable for the masses, then maybe a little deeper, and tie that back to how AI factories take these inputs to generate tokens.

 

Shane Shaneman

Absolutely. First and foremost, tokens have been around for over a decade. It's a measure of compute.

 

If you think about tokens, and we'll talk about this in a large-language model construct, a token could be a part of a word. It could be a whole word. It could be a couple of words, depending on the size of the words.

 

The prompt we make to a large-language model is measured in a number of tokens. If I'm attaching documents that I want the large-language model to summarize, those get broken down into a number of tokens. Then the output from the model also comes back as a number of tokens.
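As a toy illustration of that idea (this is not how any production tokenizer works; real models use learned subword vocabularies such as BPE, and the chunking rule below is purely made up), a short sketch of why token counts differ from word counts:

```python
# Toy illustration of LLM tokenization: real tokenizers split text into
# learned subword units, so token counts differ from word counts. This
# simplified stand-in splits on whitespace and breaks longer words into
# 4-character chunks, mimicking how rare words become multiple tokens.

def toy_tokenize(text: str, max_len: int = 4) -> list[str]:
    tokens = []
    for word in text.split():
        # Short words stay whole; longer words split into subword pieces.
        for i in range(0, len(word), max_len):
            tokens.append(word[i:i + max_len])
    return tokens

prompt = "Summarize the attached infrastructure report"
tokens = toy_tokenize(prompt)
print(tokens)
print(f"{len(prompt.split())} words -> {len(tokens)} tokens")
```

As a commonly cited rule of thumb for English text, production tokenizers average roughly three to four characters per token, which is why token counts, not word counts, are what drive cost and capacity planning.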

 

If I look at the problem that I'm trying to solve, and how I'm going to try to leverage the AI factory in solving that problem, I can also turn around and start breaking it down into, this is the number of tokens that I need to empower my workforce on a daily basis. How many tokens per employee do I think that they're going to need this year? How many next year?

 

As we continue to see these models evolve, and if we go back to this fundamental shift from data centers to AI factories, we look at object detection. A typical object detection model, we're talking maybe 20 to 30 million parameters. We've seen the model parameters explode from millions to billions, to hundreds of billions, even trillions of parameters.

 

These models are now 20,000 to 30,000 times the size of that object detection model we were so impressed with less than seven years ago. Think about it: ChatGPT came onto the world stage in November 2022. Since then, we've seen this rapid explosion in the size of the models, because it makes them more generalizable.

 

It allows us to apply them to solve a multitude of different problems. All of this comes back down to being able to measure that in terms of tokens.

 

Dan Carroll

I think the key is, too, most problems will require multiple models to solve. I'll give you an example, AI Assistant. There's been a lot of discussion around using AI Assistant to help patients get checked in.

 

When you think about that patient coming in, you can get into language translation. You can get into dialect translation now with AI much easier, which makes it more effective. We all are speaking English.

 

We can understand very easily when you get into other languages, dialect matters. I'm understanding you guys okay. We're doing pretty good.

 

First, you have language models, like just language translation models. Then the person wants to check in, and they're trying to figure out where they're going. Then you could be getting into models that are built to help with workflows, with documentation, with scheduling.

 

There are all these models that come together. Just like one person can't solve a problem, one model usually doesn't solve everything for a use case. It's usually multiple models coming together, which is where token analysis becomes very valuable, because you're trying to understand your costs for pursuing that solution.
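To make the cost side of that concrete, here is a hypothetical sketch of token analysis for a multi-model check-in assistant like the one described. Every step name, token count, price, and request volume below is an illustrative assumption, not a real quote or benchmark:

```python
# Hypothetical sketch: estimating token cost for a multi-model use case
# such as a patient check-in assistant. Each step in the workflow calls a
# different model, and each call consumes input and output tokens.
# All step names, token counts, prices, and volumes are illustrative.

pipeline = [
    # (step, input tokens per request, output tokens per request)
    ("language_translation", 400, 350),
    ("intent_routing",       150,  20),
    ("scheduling_agent",     600, 250),
    ("documentation_drafts", 900, 700),
]

PRICE_PER_1K_INPUT = 0.0005   # assumed $ per 1K input tokens
PRICE_PER_1K_OUTPUT = 0.0015  # assumed $ per 1K output tokens
REQUESTS_PER_DAY = 2_000      # assumed check-ins per day

total_in = sum(i for _, i, _ in pipeline)
total_out = sum(o for _, _, o in pipeline)
daily_cost = REQUESTS_PER_DAY * (
    total_in / 1000 * PRICE_PER_1K_INPUT
    + total_out / 1000 * PRICE_PER_1K_OUTPUT
)
print(f"Tokens per request: {total_in + total_out}")
print(f"Estimated daily cost: ${daily_cost:,.2f}")
```

Even with made-up numbers, the structure shows the point: each additional model in the pipeline adds its own token draw, so total token consumption is a property of the whole use case, not of any single model.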

 

Shane Shaneman

It's about being able to look at the most cost-effective way to execute. Because now we don't just have one scaling law for AI. Go back seven years: if I wanted a performant model, the more data I had, the better the model.

 

That was the scaling law. But now with generative AI and agentic AI, we don't just have one scaling law. We have three scaling laws.

 

You have your pre-training, but you also have post-training and fine-tuning. Then you get into a third kind of scaling as you deploy the model across your entire enterprise for long thinking. This is where those reasoning capabilities, which, let's be real, just started coming out last November and December, have already transformed the entire generative AI space.

 

That long thinking becomes so critical for organizations to factor in, to understand: how do I scale? How many tokens do I need? And what is the most cost-effective way to ensure, and I like to call it token generation capacity, that I have the capacity to generate the tokens my workforce needs, that I can empower them with?

 

And then part of that is also understanding and forecasting: what is my token requirement? How is that changing over the next two, three, four years? And this is where the relationship matters between government agencies and critical partners like OpenAI, Anthropic, and Microsoft, as well as the whole host of ISVs developing applications built on top of those open source models.

 

But it all comes back down to a metric, and this is the power of the token. The token is really at the core of the new industrial revolution, because it's the output of all the AI factories, and it produces the intelligence that then transforms those industries.

 

Dan Carroll

So... Well, and I want to hit on something for everybody listening that Shane touched on there, which was that it's still early days. There are a lot of people feeling like they're already behind the ball, that they don't think they're...

 

How do I get started? Just start. So the AI action plan that came out, there was a lot of discussion, of course, and focus on it.

 

It's incredibly timely, because a lot of federal organizations have been talking about AI for the last couple of years. But so much of the advanced development and the capabilities we've delivered, the things we just talked about, came within the last couple of months to a year.

 

And what's coming is just incredible. So there's still plenty of opportunity to get in the ballgame now. Don't wait.

 

Shane Shaneman

Yeah, I was going to say, Dan, that's the biggest... Leaders are waiting for this pace of innovation to slow down, for things to plateau. And this is not a typical S-curve.

 

This is an exponential curve. And you can't afford not to be on the right side of the exponential curve. And that's the biggest lesson, is get your hands dirty.

 

Start turning around and integrating this technology because the human machine team aspect of this is huge. The organizational trust aspect of this is huge. And the lessons that the organization will learn as they start integrating this technology as part of their tactics, techniques, and procedures is huge.

 

Scott Robohn

I would say from my own personal experience, one of the things to be thinking about here is absolutely 100% agree. Just start playing. Start getting used to using a chat-based tool.

 

That's the right place to start. Know that that's not where it stops. But think about how it can change how you get your work done, what your workflows look like.

 

And most people don't think in terms of workflow. Sorry, I'm an industrial engineer undergraduate, and I go way back to process. You have new tooling, power tools, in the form of agentic AI, digital co-workers, if you will.

 

That's a phrase a friend of mine is starting to use. I'm sure it's out there. Think about how you can do work differently and better, not just doing what you do with the occasional chat tool.

 

That's where much of this power is going to come from.

 

Dan Carroll

Well, to your point, the agentic AI and the digital worker is real. Dell has already started putting it basically in our org charts. We have these many physical workers, we have these many digital workers, and the digital workers are assigned to managers.

 

Scott Robohn

And they consume tokens. They do.

 

Dan Carroll

They have a cost and they have an output. They have goals they have to meet. Like the rest of us.

 

Shane Shaneman

Absolutely. And one of the things that NVIDIA developed, and I know we work very collaboratively with Dell on this, is developing blueprints to help customers. Instead of starting with a blank sheet of paper, you've got a blueprint.

 

How do I develop an AI agent for different functions? Through build.nvidia.com, customers can go out and play with the models; all of our NVIDIA Inference Microservices (NIMs) are available, as well as all of our blueprints. And it's a great way to leverage that, have a consultative discussion, engage Dell, and talk about the problems you're trying to solve and how AI agents will be a part of that.

 

They can then help look at the AI factory that Dell and NVIDIA have partnered together to develop, and what size factory is the right fit for what they're trying to do.

 

Scott Robohn

For sure. You're both doing a great job of making some of these complex concepts consumable by myself and listeners. What has this been like for you with your federal agency engagements?

 

I'm not trying to pull anything out or embarrass anyone. What are some of the things you've really had to educate on, where you've seen light bulbs go on?

 

Dan Carroll

Well, I'll tell you probably the biggest thing, and this kind of leads back to the beginning of the discussion, is data center readiness. They are all excited to go and try to figure out how to tackle a workload. Helping them understand the requirements for that data center has been something that usually gets people, because everybody's very excited to scope out the GPU.

 

They're very excited to scope out the storage and it's like, well, do you have enough power coming in? Is there enough power, not just coming into your building, but in the area available for you?

 

Scott Robohn

Right.

 

Dan Carroll

Which is probably the biggest challenge that we're seeing in the AI space. It isn't unique to the government. The need to find available power sources to drive these data centers is going to be critical.

 

And I love the unique thinking that's coming up across both commercial industry and the federal space. So we touched on it: the federal government is making Department of Energy lands available for AI data center builds co-located for both commercial and government workloads, which is great.

 

And then, of course, if you get into some of the commercial applications, you're seeing newer investments into newer power solutions. Nuclear is a very interesting area as things are going, and I think it's the right move.

 

Microsoft is an example: they purchased power from a reactor that's coming back online at Three Mile Island. Google has made massive investments into smaller companies that are doing small modular reactors.

 

Dell, working with its partners, is looking at SMRs as well as hydrogen fuel cells as power sources. And one of the other things that gets overlooked: AI and HPC are going to manage those power grids. It's multiple applications.

 

It's a new world for data centers. It isn't just the old, we stick the power in there, but how to manage everything outside the data center to be more efficient.

 

Scott Robohn

That's a great point. And Shane, I want to hear your piece on this too. But one of the things that I've noticed is take any application area and we talk about the infrastructure that we need for AI.

 

I come largely from a networking background, so I think a lot about what network infrastructure needs to look like to support large clusters in this elegant, complicated dance of low loss and low latency. The flip side is how AI-empowered tooling, management tools, and orchestration tools will help manage that dance, help manage the grid, help manage air traffic control. Both sides of that coin are really interesting when you look at it on a market-by-market basis.

 

Shane Shaneman

100%. And part of this is that it's about empowering the workforce and about human augmentation, as we look at the spectrum of different applications for artificial intelligence and how AI factories are going to start to transform the very fabric of our lives.

 

It's going to impact every aspect, from education to how our kids learn. Every child will have an AI tutor.

 

Every employee will also have an AI coach that helps them learn and get better at their job and prepare for the next job. And also take more and more of the menial tasks that they have to do on a daily basis off their plate. But it comes down to that human machine teaming aspect and the trust that you have to be able to build.

 

And that's really the synergy we have to drive. But it does come back down to, and you go back to the AI Action Plan, you've got to have the infrastructure in place to support that. You've got to have the token generation capacity to enable the rest of the applications.

 

Especially as we look at this roadmap: going from what I call knowledge retrieval, with retrieval-augmented generation and chatbots, to reasoning. Now I'm talking about long thinking and multiple iterative processes, chain of thought, solving larger problems.

 

And ultimately, as you go up that roadmap, the number of tokens needed per employee increases dramatically. And this is why working with a company like Dell to help customers understand that, and to prepare and design the AI-enabling infrastructure and the AI factories that will power it, is so critical.

 

Scott Robohn

So let me ask on that in particular. And sorry, Dan, we'll give you the mic back in a second. How do you make your customers literate in forecasting tokens?

 

Dan Carroll

So here's my opinion: you have to have a different discussion with your customers about what they're doing than you did in the past.

 

Scott Robohn

I bet there's a whole lot of different discussions going on.

 

Dan Carroll

Yeah. So before, like let's go back to the traditional data center. It was like, how much data are you trying to store?

 

How much capacity, how fast do you need access to it? Those types of things. Those are old questions.

 

Questions now are, what are you trying to do to improve or augment your workforce? What is the mission or business challenge that is keeping you from accelerating and being more successful?

 

Scott Robohn

And that has to be the business facing, the mission facing end of that conversation.

 

Dan Carroll

Oh, 100%. Sure. You have to talk to a lot of different people.

 

You get way outside the data center when you're talking about AI, because you have to talk to the people who really touch the mission. And that then leads back into discussions like the one we're having here: what kind of infrastructure you need to support that kind of build-out, and what kind of token allocation is going to be required to build the solution. Yeah.

 

So it is a much more intimate discussion about what they're really doing day to day and what is and isn't working. It's really more of a process work discussion versus a computer process discussion.

 

Scott Robohn

And the industrial engineering management consultant side of me says 100%. Right. Yeah.

 

What is the problem you're trying to solve? It's one of the most clarifying questions anyone can ever ask. But that does get translated into, okay, on the back end, how do we start thinking about how many tokens will this consume?

 

Are there tools or education that you jointly have to help drive that as a metric? What does that look like?

 

Shane Shaneman

That's something we're jointly developing: helping customers do what we call a token gap analysis. What is your token generation capacity? What is your forecast token requirement?

 

How do those two lines meet? Sure. Okay.

 

And then once you have that and you have an understanding, then you can go to the real discussion which is how do I maximize token impact? Sure. Okay.

 

How do I get the most use out of the tokens? Whether I'm a commercial industry, whether I'm a federal government agency, how do I make sure that the tokens are producing the biggest impact for my organization that's possible? You know, and that starts where you get into the prioritization.
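A minimal sketch of what such a token gap analysis might look like. The structure (capacity versus forecast demand, year over year) follows the idea described above, but every number, the function name, and the growth assumption are made up for illustration:

```python
# Illustrative token gap analysis: compare forecast token demand against
# generation capacity to find the year a gap opens up. All numbers and
# names here are hypothetical, for illustration only.

def token_gap(capacity_per_day: float, employees: int,
              tokens_per_employee_per_day: float,
              annual_demand_growth: float, years: int) -> list[tuple[int, float]]:
    """Return (year, surplus_or_deficit in tokens/day) for each forecast year."""
    results = []
    demand = employees * tokens_per_employee_per_day
    for year in range(1, years + 1):
        results.append((year, capacity_per_day - demand))
        demand *= 1 + annual_demand_growth  # demand grows as adoption deepens
    return results

# Assumed: 10M tokens/day capacity, 5,000 employees starting at 500
# tokens/day each, with demand doubling (100% growth) every year.
for year, gap in token_gap(10e6, 5000, 500, 1.0, 4):
    status = "surplus" if gap >= 0 else "DEFICIT"
    print(f"Year {year}: {gap:+,.0f} tokens/day ({status})")
```

Under these assumed numbers, the capacity line is crossed in year three, which is exactly the kind of result that would drive the infrastructure and prioritization discussion Shane describes.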

 

And we have to keep in mind, developing new AI applications has gone from taking months and even years to new applications being developed off of open source models in a matter of days. So the opportunity to iterate with the workforce, to help solve their problems and empower them: this process has literally gone from six, nine, and twelve months to six to twelve days. Yeah.

 

Dan Carroll

And this all plays back into that Dell NVIDIA AI Factory framework that we were talking about. That's all part of that early discussion and discovery. And part of that is the buy versus build.

 

Scott Robohn

Sure.

 

Dan Carroll

Right? There are a lot of ready applications that you can purchase through some of the partners that you mentioned a few minutes ago that can accelerate your time to application deployment. If you have to build, that doesn't mean that it's going to take necessarily a lot longer.

 

It just means you're going to be using possibly more open source with some boutique programming coming in that is done much faster because AI is available to help you do that coding.

 

Shane Shaneman

And the thing that really blows me away is how the industry is evolving and embracing this. You're seeing new companies coming out looking at how they can provide intelligence to the manufacturing industry, how they can apply the intelligence that tokens provide to the transportation industry. It's limitless.

 

And that's why, if you listen to Jensen, our CEO, he talks about this as the dawn of a new industrial revolution. It's such a special and unique moment in time, and the impact is just going to be transformational.

 

Scott Robohn

I couldn't agree with you more. One of my aha moments was a few weeks ago at an unnamed large cloud provider summit in New York City; you can connect the dots. One of my main takeaways was that that cloud provider used to focus on two people in a garage with a credit card, letting anybody get started with their IT.

 

Now, with what they're doing to bring different models to a marketplace and develop an agent marketplace, it's one person with a credit card. This age of ideas is really taking off, whether it's the people listening to this or product managers. I can now take an idea and come up with, let's call it a prototype, a rapid prototype, very easily.

 

And that's just one piece of the transformation puzzle I think you're talking about here.

 

Shane Shaneman

And if you combine what we're seeing with generative AI capabilities, not only in a large language model, but in tools like Cursor that let you vibe-code and basically produce new capabilities, right, the combination of those two is unbelievable.

 

For sure. You're seeing it already. A friend of mine, his five-year-old developed a video game by leveraging those different capabilities. That's amazing.

 

So the ability for somebody with no background in a computer language to develop these applications and apply them back to solving problems, that's the power and the transformative impact we're going to see, and why these AI factories are so important.

 

Scott Robohn

So taking that, you know, Dan, you gave us great encouragement to just try this stuff, just get started. 100%. Um, but there is this issue of how fast things are moving.

 

Right. And not to do a brief history of the internet here, but I've seen all these layers build up over the decades. We built a connectivity platform that let cloud emerge, along with the ability to aggregate lots of compute, lots of storage, lots of network, if you had a large enough credit card limit.

 

That gave us new consumption models. And all the while, AI is not a new technology, right? From the 40s and 50s to today, there have been developments along the way that you two both can speak to much better than I can.

 

But now these things have come together at this point in time. Let's just use late 2022 as that latest push that unleashed generative AI and got us interested in all sorts of other branches. Things are happening so fast that buying books is no good.

 

Trying to keep up with blogs and podcasts is somewhat helpful. What are you two both doing to help your customers stay on top of this stuff and keep up to speed as much as they're able?

 

Dan Carroll

This is an interesting question, right? The commercialization of AI is what happened in 2022. Sure. When people who traditionally don't keep up with IT now know of a technology and are using it daily.

 

Right. That's when it really escalates and spins out. And it's amazing.

 

And what I will say is, I think the younger generations coming in, like you pointed out, will be very familiar with this technology. It's just going to be part of who they are. They'll grow up with it.

 

Just like kids who grew up in 2005 grew up with cell phones, right? The challenge is upskilling your current workforce, helping them and making them comfortable.

 

And I think the biggest point of consultation organizations need to have with their people is understanding how to adopt and adapt these new capabilities into their normal workflows, right? I've seen the best solutions in the world get laid at the doorstep and never get integrated into a workflow.

 

Scott Robohn

Right.

 

Dan Carroll

And into how people do their business every day. And it's a bummer. That's where Dell comes in, working with NVIDIA, working with partners, providing education, working outside of the data center.

 

Like we said, data center discussions are critical. Sure. But talking with management, talking with the people who deal with training the workforce, and understanding how they're going to educate those folks to take advantage of all these capabilities is critical.

 

Shane Shaneman

And part of this, I think, goes back to the fact that within NVIDIA, the pace of innovation continues to accelerate, so much so that we call it Model Monday. Okay? Because we have new models that come out every week.

 

And understanding, you know, how those models have been fine-tuned or how they've been specialized.

 

Scott Robohn

Do you believe it's only one day a week? Yes. Model Monday and Model Thursday, too.

 

Shane Shaneman

But so, you know, just being able to turn around and boil down.

 

Scott Robohn

Sure.

 

Shane Shaneman

Like, here's the transformative impact that's happening. So one of the things NVIDIA hosts is our GPU Technology Conferences (GTC). We have our annual one at the end of March.

 

And the one I think is most relevant for the audience of this podcast is coming up at the end of October: GTC DC. It's a great opportunity for folks to come in. It is developer focused, but it also has tracks for decision leaders.

 

Just to stay abreast of the art of the possible. How are other people leveraging this technology, artificial intelligence, and tokens to transform their operations and their business? It's a great opportunity to see how everything's coming together.

 

And most importantly, the incredible ecosystem that's forming around this technology.

 

Scott Robohn

I will say, I was able to go to my first GTC ever this year in 2025, and there is something for everyone. Oh, there is. The worst way to state that is that your agenda choices are overwhelming.

 

Yes. And you need an AI assistant just to pick the right sessions. It probably was available and I just didn't see it.

 

But it is amazing to see all the application areas, anything you wanted to dive into. So GTC DC, that's at the end of October. Yes.

 

I highly encourage anyone listening who wants to go deeper to please do that. Dell must be doing similar things. And I know they do, on big scale and small scale, from some of the events we do together between NVIDIA, Carahsoft, and Dell.

 

Dan Carroll

Yeah, we're very large participants at GTC, and we'll be participating in GTC DC as well. So yes, absolutely. And then internally, we're doing what we call AI solution spotlights for all of our sellers, because there are so many new ISVs coming to the table.

 

Constantly. They bring new capabilities and new ways of thinking about how to solve problems.

 

Scott Robohn

Right.

 

Dan Carroll

So we have to keep up with that pace you're talking about. Who is the newest player on the market bringing something great to solve an issue? So we're educating our sellers not just on capabilities, but on how to use them and how to have an effective discussion with an organization to help them solve a problem.

 

Scott Robohn

Well, we could go on for a long time. I do want to bring this down to one last discussion point. Sure.

 

I promised you that you would have the opportunity to exercise your AI-powered crystal balls. What are one or two big things that you see coming? And I don't want to say in the next year.

 

That's too long of a timeframe. What's coming in the next couple of quarters or months that people should really be paying attention to?

 

Scott Robohn

Yeah.

 

Shane Shaneman

You know, I think one of the most transformative developments is the interface protocols that are evolving. Sure. Okay.

 

So, Model Context Protocol (MCP): being able to connect models to other enterprise resources through uniform APIs. Right. Okay.

 

That's huge: the ability to have models connected to other models, auto-discovery of resources. That allows us to integrate these capabilities much more seamlessly into the workflow, so that we're not adding to the cognitive burden of the workforce by giving them something they have to pivot between. I can go over here to this chatbot and ask a question.

 

Sure. And then I, as the human, need to be the transfer mechanism to take that insight and intelligence and apply it on this other system, in the workflow I'm responsible for executing. Being able to integrate that capability as part of the workflow is really where we start to see the tremendous impact.

 

Yep. Okay, that generative AI, and then eventually agentic AI, is going to have.
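[Editor's note] The uniform-interface idea Shane describes can be sketched in a few lines of code. This is purely illustrative and is not the actual MCP SDK or NVIDIA tooling; all class and tool names here are hypothetical. The point it shows is the one from the conversation: models discover and invoke enterprise resources through one calling convention instead of bespoke integrations.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, List


@dataclass
class Tool:
    """A resource exposed to a model through one uniform interface."""
    name: str
    description: str
    handler: Callable[[dict], Any]


class ToolRegistry:
    """Lets a model auto-discover and invoke resources uniformly."""

    def __init__(self) -> None:
        self._tools: Dict[str, Tool] = {}

    def register(self, tool: Tool) -> None:
        self._tools[tool.name] = tool

    def discover(self) -> List[str]:
        # A model can list what is available instead of being hard-wired to it.
        return [f"{t.name}: {t.description}" for t in self._tools.values()]

    def call(self, name: str, args: dict) -> Any:
        # One calling convention, regardless of the backing system.
        return self._tools[name].handler(args)


# Hypothetical example: expose a stubbed document store and ticketing system.
registry = ToolRegistry()
registry.register(Tool("search_docs", "Search internal documents",
                       lambda a: [f"doc matching {a['query']}"]))
registry.register(Tool("file_ticket", "Open a support ticket",
                       lambda a: {"ticket_id": 1, "summary": a["summary"]}))

print(registry.discover())
print(registry.call("file_ticket", {"summary": "GPU node down"}))
```

In a real MCP deployment the registry would live behind a server the model connects to, but the design choice is the same: the human no longer has to be the "transfer mechanism" between systems.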

 

Dan Carroll

And that's what I was going to call out: agentic AI, right? We're not even at the doorstep; we're down at the bottom of the stairs of where agentic AI is going to go. Right?

 

We were talking about digital assistants a minute ago. Right. The capability for agentic AI to have an impact on that, to not just do simple tasks but multi-level, complex tasks.

 

Right. Solving issues and problems is really what I'm interested in. I'm just going to give an example.

 

So, hey, I need to plan an event. Okay, I'm going to have an agent that helps me figure that out: figures out the venue, figures out the location, figures out attendance based on the topic I'm looking for. Oh, there's a storm coming in and it's affecting the venue.

 

Sure. Do we need to shift the venue? Yeah, absolutely.

 

The agent will take that into account and be able to re-plan and re-discuss and re-figure out your planning and what you're going to lay out there. Right. That's what I mean by that multi-level kind of engagement.

 

Right, where normally there'd be like three or four people involved. You will have agents with sub-agents that are making all these decisions with certain levels of human interaction and control, but doing much more complex tasks.

 

Scott Robohn

Yep. It's not hard to see that just over the horizon. It's not even over the horizon.

 

Shane, what about you?

 

Shane Shaneman

Yeah, and I was going to say, we already have government agencies today where that's really the focus: they want to deploy agentic orchestration of generative AI models and capabilities. They want to connect those models together to solve larger problems, and to offload that work so it's running in the background. But to tie this back, because this is the Data Center podcast: making all those agentic systems work requires a lot of tokens.

 

It does. Okay, and you have to be able to plan for that, because part of this, just like we talked about, is the AI scaling laws. Right.

 

If I want to deploy that agentic capability across my workforce, I have to make sure I have the token generation capacity to support it, and there's no better company to interface with to have that discussion and understand the infrastructure that's required than Dell.
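[Editor's note] The capacity-planning point Shane makes can be made concrete with a back-of-envelope calculation. All numbers below are illustrative assumptions, not Dell or NVIDIA sizing guidance; the key factor is the agent fan-out, since one user request may trigger several chained model calls.

```python
def required_tokens_per_second(users: int,
                               requests_per_user_per_hour: float,
                               tokens_per_request: int,
                               agent_fanout: int) -> float:
    """Back-of-envelope token-generation capacity for an agentic deployment.

    agent_fanout captures that one user request can trigger several
    model calls (agents invoking sub-agents behind the scenes).
    """
    requests_per_second = users * requests_per_user_per_hour / 3600
    return requests_per_second * tokens_per_request * agent_fanout


# Illustrative assumptions only: 10,000 users, 6 requests per hour each,
# ~2,000 tokens per model call, and 5 chained agent calls per request.
tps = required_tokens_per_second(10_000, 6, 2_000, 5)
print(f"{tps:,.0f} tokens/second of sustained generation capacity")
```

Even with modest per-user assumptions, the fan-out multiplier pushes the sustained token rate into territory that has to be planned for at the data center level, which is the point of the AI-factory framing.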

 

Dan Carroll

Absolutely. I appreciate you saying that. The Dell and NVIDIA relationship has been incredible, and unfortunately we're coming to a close on this when we've barely hit the tip of the iceberg. We didn't get into modular data center builds, speed to build, and the configurators that figure out all the compute that goes in there, right, and the different capabilities.

 

So, yeah, hopefully we'll have a follow-up discussion.

 

Scott Robohn

Well, you couldn't have teed that up better for me. Thank you so much. You know, this is the start of the Carahsoft Data Center Innovation Series, where we've got some great partners that you all know and work with lined up to talk about some of the things you mentioned and much more.

 

It also gives me a hook to invite you both back. We can dive into any of these things; I know you really didn't get to exercise your muscles very much today. I'm sure there's a lot more we could talk about together.

 

Shane Shaneman

Yeah, well, you know, one of the things I do want to highlight is Carahsoft is such a tremendous partner for both Dell and Nvidia. So, Scott, thank you so much for orchestrating this and thank you to Carahsoft for, for bringing such great information to our customers.

 

Scott Robohn

You are quite welcome. And I will say, in my role at Carahsoft and the interaction I've been able to have with NVIDIA and with Dell, I learn something every day about what you're doing individually and together, and about what's happening to change this market. We've said it, and I can't emphasize it enough: this is a time of incredible change.

 

Most of us have never seen anything like this, and we may not see anything like it again for a long time. But let's do the last round here. Anything else you'd like to emphasize or leave the listeners with?

 

Dan, I'll give it to you first.

 

Dan Carroll

Yeah, absolutely. Partnership is critical. Shane's been talking about the partnership between Dell and NVIDIA and, of course, the partnership with Carahsoft, but also partnership with the ISV community and the integrator communities, which are incredibly critical for doing business within the federal government because they stand closer to the mission and business goals of the customer than anybody else.

 

So yeah, partnership is how organizations are going to take advantage of this opportunity.

 

Shane Shaneman

And I'll just highlight that it's such an exciting time. We are literally at the dawn of a new industrial revolution.

 

And the choice is on us whether we want to be on the right side of the exponential curve or the wrong side. And with the power of Dell and Nvidia, we can help customers be on the right side of that curve 100%.

 

Scott Robohn

Well, Shane and Dan, thank you so much for being here. This is just the beginning. We have so much more to talk about on the Carahsoft Data Center Innovation Series.

 

Listeners, thank you for paying attention and being with us here virtually today. If you have any questions, please follow up. We can route you to Shane or to Dan and their colleagues at Dell and Nvidia respectively.

 

Thanks for tuning in and we'll talk to you next time.