MOUNTAIN VIEW, Calif. — A few years ago, Google created a new kind of computer chip to help power its giant artificial intelligence systems. These chips were designed to handle the complex processes that some believe will be a key to the future of the computer industry.

On Monday, the internet giant said it would allow other companies to buy access to those chips through its cloud-computing service. Google hopes to build a new business around the chips, called tensor processing units, or TPUs.

“We are trying to reach as many people as we can as quickly as we can,” said Zak Stone, who oversees the small team of Google engineers that designs these chips.

Google’s move highlights several sweeping changes in the way modern technology is built and operated. Google is in the vanguard of a movement to design chips specifically for artificial intelligence, a worldwide push that includes dozens of startups as well as familiar names like Intel, Qualcomm and Nvidia.

And these days, companies like Google, Amazon and Microsoft are not just big internet companies. They are big hardware makers.

As a way of cutting costs and improving the efficiency of the multibillion-dollar data centers that underpin its online empire, Google designs much of the hardware inside these massive facilities, from the computer servers to the networking gear that ties these machines together. Other internet giants do much the same.

In addition to its TPU chips, which sit inside its data centers, the company has designed an AI chip for its smartphones.

[Image: A Google TPU inside a data center]

Right now, Google’s new service is focused on computer vision, technology that teaches computers to recognize objects in images. But as time goes on, the new chips will also help businesses build a wider range of services, Stone said.

At the end of last year, hoping to accelerate its work on driverless cars, Lyft began testing Google’s new chips.

Using the chips, Lyft wanted to accelerate the development of systems that allow driverless cars to, say, identify street signs or pedestrians. “Training” these systems can take days, but with the new chips, the hope is that this will be reduced to hours.

“There is huge potential here,” said Anantha Kancherla, who oversees software for the Lyft driverless car project.

TPU chips have helped accelerate the development of everything from the Google Assistant, the service that recognizes voice commands on Android phones, to Google Translate, the internet app that translates one language into another.

The chips are also reducing Google’s dependence on chipmakers like Nvidia and Intel. In a similar move, Google designed its own servers and networking hardware, reducing its dependence on hardware makers like Dell, HP and Cisco.

This keeps costs down, which is essential when running a large online operation, said Casey Bisson, who helps oversee a cloud computing service called Joyent, which is owned by Samsung. At times, the only way to build an efficient service is to build your own hardware.

“This is about packing as much computing power as possible within a small area, within a heat budget, within a power budget,” Bisson said.

A new wave of artificial intelligence, including services like Google Assistant, is driven by “neural networks,” which are complex algorithms that can learn tasks on their own by analyzing vast amounts of data. By analyzing a database of old customer support phone calls, for example, a neural network can learn to recognize commands spoken into a smartphone. But this requires serious computing power.
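The kind of learning described above can be sketched in miniature. The following is a hypothetical illustration using synthetic data and a single-layer network, not Google’s actual code: the model starts out knowing nothing, and gradient descent repeatedly nudges its parameters so that it separates two clusters of example points.

```python
import numpy as np

# Hypothetical illustration: a one-layer network trained by gradient
# descent to tell two clusters of synthetic 2-D points apart. Real
# systems like those in the article use far larger networks and data.

rng = np.random.default_rng(0)

# Two clusters of points, labeled 0 and 1.
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)),
               rng.normal(+1.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

w = np.zeros(2)   # weights, initially "knowing nothing"
b = 0.0           # bias

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, b):
    # Cross-entropy: how badly the current parameters fit the examples.
    p = sigmoid(X @ w + b)
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

initial_loss = loss(w, b)

# Training: repeatedly adjust parameters to reduce the error.
for _ in range(200):
    p = sigmoid(X @ w + b)
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

final_loss = loss(w, b)
accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

Every step of that loop is dominated by matrix arithmetic, which is exactly the workload that GPUs, and chips like the TPU, are built to accelerate.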

Typically, engineers train these algorithms using graphics processing units, or GPUs, which are chips that were originally designed for rendering images for games and other graphics-heavy software. Most of these chips are supplied by Nvidia.

In designing its own AI chips, Google was looking to exceed what was possible with these graphics-oriented chips, speed up its own AI work and lure more businesses onto its cloud services.

[Image: Diagram of a Google TPU]

At the same time, Google has gained some independence from Nvidia and an ability to negotiate lower prices with its chip suppliers.

“Google has become so big, it makes sense to invest in chips,” said Fred Weber, who spent a decade as the chief technology officer at the chipmaker AMD. “That gives them leverage. They can cut out the middleman.”

This does not mean that Google will stop buying chips from Nvidia and other chipmakers. But it is altering the market. “Who’s buying and who’s selling has changed,” Weber said.

Over the years, Google has even flirted with the possibility of designing its own version of the chips it buys from Intel.

Weber and other industry insiders questioned whether Google would ever go that far, because a general-purpose processor is so complex that designing and maintaining one would be far more difficult. But at a private event in San Francisco last fall, David Patterson, a computer science professor at the University of California, Berkeley, who now works on chip technologies at Google, was asked if the company would do it.

“That’s not rocket science,” he said.