Can ChatGPT really teach you to trade?
'Like a six-year-old using a calculator before learning addition,' professor warns against AI over-reliance

People are already leaning on ChatGPT to explain homework, act as a therapist, and answer medical questions. For many, it feels like an all-purpose guide that never switches off.
Trading is the latest frontier. New investors are asking ChatGPT for strategies, market analysis, and shortcuts to complex concepts. Yet experts caution that relying on unverified AI advice in volatile markets can turn education into an expensive error.
How traders use ChatGPT
“ChatGPT and other LLMs are great educational tools, able to create visualizations, hold conversations, and adapt answers to your background,” Michael Kampouridis, Faculty Dean (Postgraduate) for the Faculty of Science and Health at the University of Essex, told The Crypto Radio.
However, “They’re tools to support informed decisions, not replace human input,” he warned.
ChatGPT is often used in trading to analyze news headlines, generate trade ideas, summarize economic reports, and even script algorithmic strategies.
Euan Sinclair, Portfolio Manager at Hull Tactical Asset Management, discussed ChatGPT’s limitations for beginners in trading, emphasizing that specific questions work far better than broad, open-ended queries.
“It’s very good at answering specific questions, but if you just say: ‘teach me trading’, that’s too vague,” Sinclair told The Crypto Radio, highlighting how beginners don’t know what resources to seek out.
He also argued that technical analysis has no predictive ability. “Goldman Sachs [would] have a technical analysis trading desk if it really worked.”
Still, Sinclair said ChatGPT is most useful for specific tasks – writing Python scripts, drafting blog posts, or locating research papers – where it democratizes access to knowledge, much like Google simplified library research.
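For illustration only, the kind of narrow, well-scoped request Sinclair describes might look like asking ChatGPT for a short Python script that compares a fast and a slow simple moving average over recent closing prices. The function names and sample prices below are hypothetical, not code from either interviewee, and the output is not trading advice – a minimal sketch of a “specific task,” not a strategy.

# Illustrative sketch only: the sort of narrow Python task a trader might hand to ChatGPT.
# The closing prices below are made-up sample data, not a recommendation or a real strategy.

def sma(prices, window):
    """Simple moving average of the last `window` prices; None if there is too little data."""
    if len(prices) < window:
        return None
    return sum(prices[-window:]) / window

def sma_trend(prices, fast=5, slow=20):
    """Label the current trend by comparing a fast and a slow simple moving average."""
    fast_avg, slow_avg = sma(prices, fast), sma(prices, slow)
    if fast_avg is None or slow_avg is None:
        return "not enough data"
    return "fast SMA above slow SMA" if fast_avg > slow_avg else "fast SMA below slow SMA"

closes = [101.2, 102.5, 103.1, 102.8, 104.0, 105.2, 104.7,
          106.1, 107.3, 106.8, 108.0, 109.5, 110.2, 109.8,
          111.0, 112.4, 111.9, 113.0, 114.1, 113.6, 115.0]

print(sma_trend(closes))  # prints "fast SMA above slow SMA" for this upward-drifting sample

Checking logic like this line by line, rather than trusting the output, is the kind of human input both experts say the tool still needs.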
Where AI falls short
Sinclair warned that ChatGPT can provide incorrect answers, and verifying every detail is often impractical.
“I was watching a medical drama and wondered when the first appendicitis operation happened. I asked ChatGPT, and it gave me an answer. Later, Wikipedia showed a different one, so I asked again. It replied, ‘Oh yes, you’re correct!’ – but it should have known that in the first place,” he said.
Kampouridis stressed that ChatGPT should never be mistaken for foolproof: “It can compile and execute but cannot avoid logical errors.”
These gaps in accuracy and reasoning make it risky for traders to depend on ChatGPT without verification or deeper knowledge.
Lessons from classrooms
Classrooms show how AI can help – but also how over-reliance risks skipping core skills. Photo: Unsplash / Nathan Cima
Findings published in April 2025 by the American School District Panel showed growing adoption of AI in schools. The study collected survey data in fall 2023 and fall 2024 and included interviews with 15 district leaders to understand how schools are approaching AI-related training and decisions.
It found that 48% of U.S. school districts had trained teachers on AI by fall 2024, up 25 percentage points from the previous year. This rapid rise reflects how quickly AI is becoming embedded in learning environments.
For Kampouridis, the trend carries risks. “At my computer science department in university, students use ChatGPT from day one – before learning coding basics,” he said. “It’s like a six-year-old using a calculator before mastering addition – they won’t learn properly due to over-reliance. The same goes for ChatGPT users,” he stressed.
He also argued that education itself provides the antidote to this problem. “Education helps users avoid biases by teaching them to question and verify recommendations,” he said.
Just as teachers worry students may skip foundational knowledge, traders risk skipping the hard lessons of markets if they lean too heavily on ChatGPT for guidance.
Bias and blind spots
Kampouridis highlighted how ChatGPT and other LLMs can amplify confirmation bias or FOMO, especially when users don’t know how to frame prompts or seek validation over truth.
He advised traders to keep a trading journal, logging their reasoning before entering each trade, and later to ask ChatGPT to review those past trades – without revealing how they turned out – to see what advice it gives.
To start learning, Kampouridis advised beginners to use simple, specific prompts and avoid ambiguity that makes ChatGPT wander. He also suggested telling ChatGPT how much detail to provide – a short definition, a brief answer, a technical explanation, or a mathematical formula.
“For simple definitions like technical analysis, clear and simple prompts are key,” he explained.
ChatGPT: guide, not teacher
Sinclair suggested treating ChatGPT less like an authority and more like a helpful friend – useful, but not always right.
“It’s hard to blame ChatGPT if traders lose money due to poor instructions,” he noted. “I don’t know legit trading firms relying on it that much – maybe some retail traders. Like any tool, you can’t blame ChatGPT more than Google or Excel,” he added.
“In the future, this may change – ChatGPT’s improved a lot in two years. But now, no one uses it as core to their operations.”
For Kampouridis, the risk is not in using ChatGPT, but in leaning on it too much. “[LLMs] are great if you know what you’re doing, but over-reliance without knowledge means you won’t learn and will ultimately fail,” he stressed.
Together, their views frame ChatGPT as a helpful guide for traders – but not a teacher.