By Amir Khoshniyati, VP of Marketing, Wiliot
I spent last week at MODEX 2026 in Atlanta, and I kept coming back to the same observation – my conversations this year felt more specific than in years past.
What I mean by that is, a year or two ago, much of the language around AI, automation, and visibility still felt broad and exploratory. People understood which technologies mattered. They knew AI was going to reshape operations. But the details – what to do, how to implement, how to measure impact, and where these systems should actually be deployed – were still taking shape.
At MODEX, things felt different this year. The level of understanding has matured. Conversations were less high-level and much more grounded in specificity and execution – focused on where data originates, how quickly it can be validated, and what decisions systems can make once that data is available.
That shift reflects an important stage of market maturation. The ambition hasn’t changed, but the discussion has moved closer to execution – grounded in how these new technologies actually operate at scale, and how quickly they can be deployed.
What stood out most here was how AI showed up in those conversations. It wasn’t framed as a capability in isolation – it was tied directly to workflows. Planning systems that adjust in real time. Warehouse operations that dynamically reprioritize work. Exception management that triggers action without waiting for manual intervention. The focus was on how AI performs inside the flow of operations, not outside of it.
That lines up closely with what we’re seeing in our day-to-day work at Wiliot. And the more specific these conversations become, the more they converge on a core requirement: AI is only as effective as the data it receives. And in supply chains, that means continuous, reliable data from the physical world.
When supply chains can continuously capture signals from products and assets – location, temperature, humidity, movement, dwell time – they can move beyond static snapshots and begin operating in real time. Decisions don’t wait for the next scan or update; they happen as conditions change. That distinction – and its importance – came through clearly in many of my conversations at MODEX.
Warehouse automation was also a major focus at the 2026 show. But what stood out wasn’t just the technology – it was how people are thinking about deploying it. Conversations centered less on massive, all-at-once scale and more on accessibility and speed to value.
Systems like autonomous mobile robots, automated storage and retrieval, and conveyor and sortation systems are being designed to roll out in phases, integrate into existing operations, and deliver impact without requiring a full network overhaul. This model is expanding adoption across the market, with strong momentum among both large enterprises and, notably, a growing number of small and mid-sized operators.
That trend mirrors what we’re seeing with Physical AI adoption. Early deployments were concentrated among the largest operators – companies with the scale to invest early and push new models into production. What’s changing now is broader interest across the market.
There’s competitive pressure to keep up with those leaders, but there’s also a growing understanding that you don’t need to be a global giant to benefit from Physical AI. The barrier to entry is coming down, and the use cases are becoming clearer.
The same theme of specificity also showed up in how people talked about visibility. For years, visibility has meant tracking where something was the last time it was scanned. That still has value, but it doesn’t provide the level of data specificity or granularity that AI-based supply chains increasingly require. Last-scan data tells you what happened. It doesn’t give systems enough context to understand what’s happening now – or what to do next.
What companies are pushing toward now is continuous visibility – a live, real-time view of inventory, assets, and conditions as they change. That kind of visibility is where the market is headed – and what supply chain executives are most eager to have.
That shift is exactly what we’ve been building toward at Wiliot, through our Physical AI platform, our battery-free IoT Pixels, and partnerships with platforms like Databricks that can ingest and process large volumes of real-time, physical-world data.
Encouragingly, those same themes carried into the board meeting at the Auburn University RFID Lab later that week. Auburn has spent more than two decades helping retailers and brands deploy RFID at scale – driving standards, testing methodologies, and real-world implementations across the industry. Today, that work is expanding to include broader sensing technologies across the supply chain.
That evolution is particularly important today. The focus is expanding from identification to richer, continuous data about physical assets – capturing condition, movement, and context, and making that data usable across enterprise systems.
The alignment across MODEX, the Auburn discussions, and what we hear every day at Wiliot is hard to ignore. There is growing consensus around the need for stronger standards, more dynamic data models, and cloud-connected architectures that allow information to move freely across systems. Efforts like GS1 Digital Link and EPCIS are also moving in that direction, making product identity and event data more accessible and interoperable across partners.
For us, that alignment is validating. We’ve been investing in Physical AI because we believe supply chains need a continuous data layer – not more disconnected events. Last week reinforced that view.
The questions being asked are more specific now, and more grounded in execution: What can I see? What can I trust? What can I automate? That level of clarity reflects a deeper understanding across the industry – and it’s what will drive real, measurable progress in how supply chains operate in 2026.