LLMs can't solve the world's hardest problems. These scientists build new AI models that can.
My conversation with Greg Fallon of Geminus AI
AI adoption in the energy sector is accelerating quickly, but large language models (LLMs) like ChatGPT, Google Gemini, and Claude can’t do everything the industry needs them to do. While LLMs dominate headlines, the hardest engineering problems in the power industry cannot be solved by these generic models trained on readily available internet data. They require a different kind of model: one built on real-world data and the laws of physics.
Last week I sat down with Greg Fallon of Geminus AI to talk about how companies like his are training their own AI models to solve scientific problems that LLMs can’t, like grid simulation and optimization.
This week’s newsletter will dive into our conversation and break down why our grid needs AI, where LLMs fall short, and how scientists are building new models to help us tackle some of the biggest problems the energy industry is facing today.
Why does our grid need AI?
Our electric grid is changing fast, driven by issues like:
A rapid surge in electricity demand, driven heavily by AI data centers.
A slow interconnection process that causes multi-year wait times to install new power generation resources.
A wave of inverter-based resources, adding dynamic, hard-to-simulate behavior to the grid.
Aging substations, transformers, and circuits, many of which were never designed for modern loads.
Why can’t LLMs be used for these complex energy problems?
LLMs are powerful, but they are fundamentally the wrong tool for complex engineering problems like physical system modeling. They only learn from text and images that are readily available on the internet, not real-world sensor data or the laws of physics.
There are a few primary limitations that make LLMs unsuitable for engineering tasks:
1. They don’t understand physical cause and effect
LLMs generate text that sounds plausible, not predictions that obey thermodynamics, circuit behavior, or power-flow equations.
“Getting LLMs to do any sort of reliable prediction is really hard, if not impossible.”
Greg Fallon, CEO of Geminus AI
2. LLMs don’t have access to clean engineering datasets
Industrial datasets are proprietary and rarely posted on the internet where an LLM could train on them. Even if an LLM had access, these datasets are usually small, noisy, and incomplete, which limits how much a model can learn from them.
LLMs thrive on massive, clean datasets that the energy industry typically doesn’t have.
“There’s a huge hesitancy to share data in the energy industry. There are even laws in a lot of countries that energy data is so precious that it can’t leave the borders.”
Greg Fallon, CEO of Geminus AI
3. Physical assets behave differently in the real world
Two identical transformers operating in different environments will not age the same way. Two identical pumps won’t respond the same under stress. These tiny variations become major failure points, and LLMs have no way to account for them.
“Everything is very particular, that 2% difference can be massively impactful.”
Greg Fallon, CEO of Geminus AI
Scientific AI is the method that can actually solve these problems
Unlike LLMs, scientific AI benefits from several built-in moats:
Enterprise data stays private, and regulations often prevent it from being shared widely.
Model behavior must reflect asset-specific quirks, which generic models can’t capture.
Physics models remain core to engineering, meaning simulators will always be in the loop.
Accuracy and reliability matter. Hallucinations are unacceptable for grid decisions.
Scientific AI is a different class of AI entirely, designed to augment engineering judgment rather than simply analyze and generate text and images.
Instead of relying on language patterns, scientific models combine:
Simulation data already used by engineers
Real-world sensor data
Physics-informed algorithms
Scientific machine-learning techniques that accelerate modeling
This hybrid approach allows Geminus’ models to behave like fast, accurate surrogates for complex engineering simulations.
“Creating foundational models based on scientific algorithms gives more repeatable results.”
Greg Fallon, CEO of Geminus AI
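The article doesn’t describe Geminus’ algorithms in detail, but a rough sketch of the general idea behind a physics-informed surrogate (everything below, from the toy heat-conduction equation to the network size, is an illustrative assumption, not Geminus’ code) could look like this:

```python
# Illustrative sketch only: a tiny physics-informed surrogate in PyTorch.
# Toy problem: steady-state heat conduction T(x) on [0, 1] with d2T/dx2 = 0
# and boundary conditions T(0) = 0, T(1) = 1 (exact solution: T(x) = x).
import torch
import torch.nn as nn

torch.manual_seed(0)
surrogate = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

# A handful of "expensive simulation" samples (faked here with the exact solution).
x_sim = torch.tensor([[0.1], [0.5], [0.9]])
t_sim = x_sim.clone()

# Collocation points where the physics residual is enforced.
x_phys = torch.linspace(0, 1, 50).reshape(-1, 1).requires_grad_(True)

# Boundary conditions.
bc_x = torch.tensor([[0.0], [1.0]])
bc_t = torch.tensor([[0.0], [1.0]])

for step in range(3000):
    opt.zero_grad()

    # 1) Data loss: match the few simulation results we can afford.
    data_loss = ((surrogate(x_sim) - t_sim) ** 2).mean()

    # 2) Physics loss: penalize violations of d2T/dx2 = 0 via autograd.
    t = surrogate(x_phys)
    dT = torch.autograd.grad(t.sum(), x_phys, create_graph=True)[0]
    d2T = torch.autograd.grad(dT.sum(), x_phys, create_graph=True)[0]
    physics_loss = (d2T ** 2).mean()

    # 3) Boundary-condition loss.
    bc_loss = ((surrogate(bc_x) - bc_t) ** 2).mean()

    (data_loss + physics_loss + bc_loss).backward()
    opt.step()

# The trained surrogate answers in microseconds what a simulator would take
# far longer to compute.
print(surrogate(torch.tensor([[0.25]])))  # expect a value near 0.25
```

The detail worth noticing is the loss function: it mixes a small amount of simulation data with a physics residual, so the network is penalized whenever its predictions violate the governing equation, not just when they miss the training points.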
The results are surprising
Greg shared a couple of examples of just how fast scientific AI can unlock value:
Aerospace simulator built in just three weeks
A physics simulator used in aerospace required seven days to compute a single case; generating enough cases to train a traditional ML model would have taken years.
Geminus reduced the requirement to just 12 simulation runs and combined them with additional data to build a functional model in three weeks.
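The piece doesn’t say how those 12 runs were combined with other data, but the general pattern of fitting a fast surrogate to a handful of expensive simulation cases is well established. Here is a minimal sketch using Gaussian process regression (the stand-in simulator, design points, and kernel are assumptions for illustration, not Geminus’ pipeline):

```python
# Illustrative sketch: fit a surrogate to a dozen expensive simulation runs.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    """Stand-in for a solver that takes days per case."""
    return np.sin(3 * x) + 0.5 * x ** 2

# Twelve design points (a real workflow would use a space-filling design
# such as Latin hypercube sampling over many input dimensions).
x_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_train = expensive_simulation(x_train).ravel()

kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
surrogate.fit(x_train, y_train)

# The surrogate returns a prediction plus an uncertainty estimate in
# milliseconds instead of days per case.
mean, std = surrogate.predict(np.array([[1.3]]), return_std=True)
print(f"predicted output: {mean[0]:.3f} +/- {std[0]:.3f}")
```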
Simulation of a massive oil field with over 2,000 wells
A brute-force model would have required nine months of compute time.
Geminus built the model in 4–5 hours, enabling operators to optimize settings and increase output by 15% in a single day.
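Again, this is not Geminus’ actual workflow, but it illustrates why the speed-up matters: once a surrogate evaluates in milliseconds, tuning operating settings becomes a routine numerical search. A toy sketch (the objective, setting names, and bounds are invented for the example):

```python
# Illustrative sketch: search over operating settings using a fast surrogate.
import numpy as np
from scipy.optimize import minimize

def surrogate_output(settings):
    """Stand-in for a trained surrogate: predicted field output (higher is better)."""
    choke, injection = settings
    return 1.0 - (choke - 0.6) ** 2 - (injection - 0.4) ** 2

# Maximize predicted output by minimizing its negative within operating bounds.
result = minimize(
    lambda s: -surrogate_output(s),
    x0=np.array([0.5, 0.5]),
    bounds=[(0.0, 1.0), (0.0, 1.0)],
    method="L-BFGS-B",
)
print("best settings:", result.x, "predicted output:", -result.fun)
```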
These kinds of speed-ups open the door to real-time decision support, something that is impossible with today’s slow engineering workflows.
The ultimate goal: an ultra-intelligent engineering assistant
During our conversation, Greg described a future where enterprises have a unified intelligence layer covering every side of the business: a system that can answer engineering questions instantly by pulling from simulations, models, and historical data.
Imagine typing:
“Is it possible to connect a 100 MW generator at this substation?”
Instead of waiting weeks for interconnection studies, the system generates an accurate, physics-informed response in seconds.
This isn’t a generic chatbot. It’s a domain-specific engineering brain built on physics and scientific AI.
And this isn’t just a pipe dream. The industry is getting close to making this reality.
“We can already do really significant pieces right now.”
Greg Fallon, CEO of Geminus AI
How this applies to the power industry
Scientific AI has several practical applications for utilities today:
1. Speeding up interconnection studies
New generator interconnections often wait 5–7 years for approval.
Faster models can reduce engineering and regulatory bottlenecks.
2. Modeling inverter-based resources
Solar, batteries, and other inverter-based resources introduce fast dynamics that existing tools struggle to predict.
Scientific models can replicate this behavior more accurately.
3. Understanding grid stress and failure risk
Utilities face constant questions about asset life, overload risk, and weather exposure.
Physics-based models provide clearer insight to improve grid resiliency and reduce power outages.
4. Improving efficiency and reducing emissions
More accurate forecasting and optimization can keep renewables online longer and limit unnecessary fossil fuel power generation.
Who else is working in this space?
Geminus is part of a growing ecosystem of companies training their own AI models for the energy industry. Others include:
Ansys & Siemens Digital Industries – simulation-driven machine learning
WattsUp – developing AI models for predictive maintenance on distributed energy resources, starting with EV chargers
OpenDrawing – developing AI models to automate the electrical project cost estimating and takeoff processes
While their approaches differ, the trend is clear: the future of AI in the energy industry needs to be grounded in science.


