Sep 25, 2025

AI Platforms Are Reshaping Pharma Logistics at Bayer

Nicole Hemsoth Prickett

Bayer outlines how internal AI platforms and edge automation built on smart labels, drones, and IoT sensors can shift pharma warehousing from reactive operations to predictive compliance.

When it comes to almost anything in the world of logistics, there is a staggering degree of complexity in that behind-the-scenes act of getting your package from point A to point B. 

Those complexities quickly turn exponential for pharma companies as they move drug therapies with six-month shelf lives across a global cold chain with every action documented for regulatory compliance. 

At Bayer’s Berkeley facility, a $200 million biotech site focused on hematology, cell and gene therapy, and neurodegenerative research, this system still runs on fragmented workflows, paper records, and reactive operations. As Prashanth Chakravartula, who leads warehousing and logistics at Bayer, put it, “we do have a very reactive approach, even though it is customer driven… it is reactive in terms of our operations.”

He told us at the AI Infra Summit that thirty percent of employee time is spent on manual data entry. In an industry defined by precision, that number is both an inefficiency and a risk.

Chakravartula showed how edge AI and automation are being combined to address this kind of problem, despite all the complexities. 

The model he describes is layered: automation first, intelligence second. Unless the flow of goods is instrumented, tagged, and continuously measured, AI has no baseline to work against. But once that exists, AI agents can shift compliance from punitive to preventative, and logistics planning from reactive to predictive.

He says that Bayer has been developing its own internal AI environment rather than waiting for external vendors to adapt to life sciences needs. One of those in-house platforms is MyGenAssist, an LLM environment tuned for life sciences professionals. As he describes it, it is not a chatbot front-end but a framework, with tens of thousands of users across the pharmaceuticals, consumer, and crop science units, that can spin up agents tailored to their functions.

“We have around 12,000 plus agents available for all kinds of use ranging from deep research agents, workflow automation, knowledge management, advice and chat,” he explains.

Bayer has 50,000-plus users and 25,000 active users a month. “This is showing the traction that we’re having with such an internal, developed solution,” Chakravartula says, and to him, it proves the platform is not just a pilot or side project but a system embedded in day-to-day work.

The architecture is modular with large language models from external providers that can be pulled in, though the agents themselves live inside Bayer’s controlled environment. Scientists can “talk to the data” directly rather than building one-off analysis pipelines.

Another internally developed AI tool, Project Maya, extends this to the production floor. It is the conduit that carries manufacturing data (MES records, equipment telemetry, laboratory outputs, and so on) into the MyGenAssist environment.

“Maya essentially enables the users to talk to the MES data, for example, the equipment data, when they have experiments running in the labs with the laboratory data,” he says. In practice, that means an engineer can interrogate the performance of a specific reactor, or analyze deviations in time series logs, without waiting for a bespoke integration. The design principle is modularity. APIs and services are built as reusable building blocks, so scaling from one site to global coverage is an architectural choice, not a reinvention.
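The kind of reusable building block this implies can be imagined with a small sketch. Here is a minimal Python illustration of the deviation analysis an engineer might run over equipment time-series logs; the function name and the sigma threshold are assumptions for illustration, not Bayer's actual API:

```python
from statistics import mean, stdev

def flag_deviations(readings, k=3.0):
    """Flag time-series points more than k standard deviations from
    the mean -- a stand-in for the kind of deviation analysis an
    engineer might run against equipment telemetry."""
    mu = mean(readings)
    sigma = stdev(readings)
    if sigma == 0:
        return []
    return [i for i, r in enumerate(readings) if abs(r - mu) > k * sigma]
```

In a modular setup, a function like this would sit behind a service API so any site can call it against its own telemetry rather than rebuilding the analysis locally.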

All of that AI infrastructure depends on a reliable, real-time flow of data from warehouses and logistics systems. 

At the Berkeley site, the problem set is dense with both inbound and outbound flows, cold storage with different temperature regimes, strict shelf-life tracking, supplier coordination, and FDA-grade documentation that must be captured at the exact moment material moves. Failure is measured in wasted product and regulatory exposure.

The first requirement is automation. As Chakravartula explained, the bottleneck today is human effort. “30% of the employee time has gone into manual data entry,” he says, adding that the interventions being piloted are designed to strip that burden away and replace it with machine-captured data that feeds forward into AI systems.

Smart labels are the starting point. Every package moving through the system can carry metadata including location, time stamps, process history, and temperature exposure. That information can be read instantly, eliminating the need for operators to reconcile multiple paper logs.
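To make that concrete, here is a minimal sketch of the metadata a smart label might carry and how it could be checked the moment it is read. The field names and the 2 to 8 degrees Celsius cold-chain band are illustrative assumptions, not Bayer's actual schema:

```python
from dataclasses import dataclass

@dataclass
class SmartLabel:
    """Illustrative payload for a smart label: identity, location,
    process timestamps, and a log of temperature exposure."""
    package_id: str
    location: str
    timestamps: list          # process history checkpoints
    temperature_log: list     # (timestamp, deg_C) pairs

    def excursions(self, low=2.0, high=8.0):
        """Return readings outside the assumed 2-8 C cold-chain band."""
        return [(t, c) for t, c in self.temperature_log
                if not (low <= c <= high)]
```

Because the check runs at read time, an out-of-band package is flagged on arrival rather than discovered during a later paper reconciliation.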

Document digitization is another vector he cites. “The OCR [optical character recognition] technology has been there for a very long time. But the traction of the OCR technology using the AI agents has been a proven methodology.”

The idea is to bring paper compliance documents into digital form without losing regulatory fidelity: the ink, the date formats, the signatures. AI agents, he says, can then perform electronic verification, ensuring that requirements are met in real time instead of during audits.
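That real-time verification step can be sketched as a rule check over OCR-extracted fields. The required fields and date format here are assumptions for illustration, not an actual regulatory checklist:

```python
from datetime import datetime

# Assumed required fields for an illustrative compliance record.
REQUIRED_FIELDS = ("batch_id", "signed_by", "date")

def verify_record(extracted: dict):
    """Electronically verify an OCR-extracted record: every required
    field present and the date parseable. Returns a list of
    violations; an empty list means the record passes."""
    issues = [f"missing field: {f}" for f in REQUIRED_FIELDS
              if not extracted.get(f)]
    date = extracted.get("date")
    if date:
        try:
            datetime.strptime(date, "%Y-%m-%d")  # assumed date format
        except ValueError:
            issues.append(f"unparseable date: {date}")
    return issues
```

The point of the pattern is timing: the violation list comes back while the material is still moving, not months later in an audit.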

Inventory management is being tested with drones. “They have drones in today’s market, where they have retrofitted the drones with about 14 to 16 cameras and four to six barcode readers, so they are autonomous drones which can move around the warehouse and they can do the inventory counts themselves.” 

This replaces manual counts and frees operators from repetitive scanning work, while generating continuous streams of inventory data that can be cross-checked against demand forecasts.
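The cross-check itself can be as simple as reconciling the drone's counts against what the inventory system expects. A minimal sketch, with hypothetical SKU names and tolerance:

```python
def reconcile_counts(drone_counts: dict, expected: dict, tolerance=0):
    """Compare drone-scanned inventory counts against the inventory
    system's expected quantities; return the discrepancy for every
    SKU whose gap exceeds the tolerance."""
    skus = set(drone_counts) | set(expected)
    return {s: drone_counts.get(s, 0) - expected.get(s, 0)
            for s in skus
            if abs(drone_counts.get(s, 0) - expected.get(s, 0)) > tolerance}
```

Because the drones count continuously, this reconciliation can run daily instead of waiting for a periodic manual cycle count.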

Chakravartula also outlined how IoT sensors extend the visibility to global transport. “There are IoT sensors in today’s world that we can use whenever we have our vehicles shipped overseas, through ocean, then we can track the conditions of those materials. As soon as they reach the port, we are immediately able to download the data.” These devices monitor temperature, humidity, acceleration, and light exposure, building a digital record of environmental conditions across the entire cold chain.
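The digital record assembled when that data is downloaded at port might look like the following sketch. The reading schema (temperature, humidity, shock) is an assumption for illustration:

```python
def summarize_shipment(readings):
    """Condense a voyage's worth of IoT sensor readings (dicts with
    assumed temp_c, humidity_pct, and shock_g keys) into the summary
    record reviewed when the container reaches port."""
    return {
        "temp_min": min(r["temp_c"] for r in readings),
        "temp_max": max(r["temp_c"] for r in readings),
        "humidity_max": max(r["humidity_pct"] for r in readings),
        "max_shock_g": max(r["shock_g"] for r in readings),
    }
```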

With automation producing near real-time data, AI can be applied upstream instead of downstream. Compliance, for example, can be reimagined. “If we could create AI agents which can absorb the repository of the standard operating procedures, then the AI agents can hand hold.” In other words, instead of penalizing operators after a mistake, agents guide them step by step, flagging deviations before they happen.
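The preventative pattern he describes, checking a proposed action against the SOP before it is taken rather than penalizing afterward, can be sketched like this (the step names are hypothetical):

```python
def next_allowed_step(sop_steps, completed):
    """Given an SOP's ordered steps and what the operator has already
    done, return the single step the agent should guide them to next."""
    if completed != sop_steps[:len(completed)]:
        raise ValueError("completed steps deviate from the SOP order")
    return sop_steps[len(completed)] if len(completed) < len(sop_steps) else None

def check_step(sop_steps, completed, proposed):
    """Preventative check: flag a proposed action before it is taken."""
    return proposed == next_allowed_step(sop_steps, completed)
```

A production agent would draw the step sequence from the SOP repository itself; here the list is supplied directly to keep the sketch self-contained.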

Planning can also be reframed, he argues. “If the system, or if the AI agent can conduct simulations for different scenarios based on different demand signals, that could give us the opportunity to have better predictability of our processes.” This is where digital twin models and demand-signal analysis intersect: edge data feeds into scenario simulations, which in turn inform scheduling, storage allocation, and outbound planning.
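A Monte Carlo sketch of the scenario simulation he describes: sample a demand signal many times and estimate how often it would exceed a storage constraint. The normal demand model and every number here are illustrative assumptions:

```python
import random

def simulate_scenarios(mean_demand, sd, capacity, n=10_000, seed=42):
    """Estimate, by simulation, the probability that weekly demand
    exceeds available cold-storage capacity under an assumed
    normally distributed demand signal."""
    rng = random.Random(seed)
    overflows = sum(1 for _ in range(n)
                    if rng.gauss(mean_demand, sd) > capacity)
    return overflows / n
```

Running the same simulation across several demand scenarios is what would then inform scheduling, storage allocation, and outbound planning.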

“One of the most important things is that as the process is being conducted, as the material is moving, the data needs to be generated in the back end.” 

Bayer’s wager is that an internal AI platform, already scaled to tens of thousands of users, can be fused with edge automation technologies to build a more predictive, operator-centric system. The goal is not to remove humans from the loop but to free them from manual data entry and compliance busywork so they can focus on higher-order tasks.

And for Bayer, the larger question is how to engineer AI infrastructure that can support real-world constraints, where every delay, every error, and every missing data point carries consequences.


© VAST 2025. All rights reserved.
