
Forum Index : Microcontroller and PC projects : More AI magic with Dave's Garage

     Page 1 of 2    
LeoNicolas

Guru

Joined: 07/10/2020
Location: Canada
Posts: 558
Posted: 01:00pm 16 Dec 2025

From Zero to READY: Recreating C64 BASIC in Visual Studio Code
 
Volhout
Guru

Joined: 05/03/2018
Location: Netherlands
Posts: 5648
Posted: 03:20pm 16 Dec 2025

Impressive.

Although the small four-line program showed a problem (TAB missing), the debugging of larger BASIC programs will show more. No doubt.

But it is impressive that it is possible at all. With sufficient focus and money (paid AI licences?) you might get C64 BASIC in a few days. So a 13-year-old whizzkid with time to spare could create MMBasic (Geoff and Peter's work, and many others', which took 14 years, 2011-2025) in a week or so.

Terrifying....

Volhout
PicomiteVGA PETSCII ROBOTS
 
matherp
Guru

Joined: 11/12/2012
Location: United Kingdom
Posts: 10923
Posted: 04:33pm 16 Dec 2025

  Quote  So a 13-year-old whizzkid with time to spare could create MMBasic (Geoff and Peter's work, and many others', which took 14 years, 2011-2025) in a week or so.


No: not without understanding exactly what he was trying to achieve and the environment in which he was trying to achieve it. As I said in the previous thread, the role of the programmer is dead, but the roles of systems analyst, systems architect and designer are now the crucial areas and the needed skills.
Edited 2025-12-17 02:34 by matherp
 
karlelch

Guru

Joined: 30/10/2014
Location: Germany
Posts: 314
Posted: 06:33pm 16 Dec 2025

  matherp said  ... As I said in the previous thread, the role of the programmer is dead but the roles of systems analyst, systems architect and designer are now the crucial areas and needed skills

I fully agree
 
robert.rozee
Guru

Joined: 31/12/2012
Location: New Zealand
Posts: 2489
Posted: 09:07pm 16 Dec 2025

while i am not a fan of the 'AI revolution', the results in Dave Plummer's video are really impressive. although, i do wonder if the AI started out with a synthesis of the various BASIC interpreter source code out on the net?

here is a question for Peter and/or Geoff: if you took the source code for the Micromite MKII MMBasic (ie, MX170) and presented it to one of these AIs with instructions to optimise the source code such that the compiled binary was (a) no less efficient wrt speed, and (b) minimised wrt the generated binary size, what would the result be?

the reason i suggest the MX170 version is that it is feature-complete and well-tested, while already constrained wrt binary size. if the AI could squeeze 10k or so of extra flash space out of it, then there may be some possibility to, for instance, add onboard SD card support, or double-precision FP maths.


cheers,
rob   :-)
Edited 2025-12-17 07:08 by robert.rozee
 
JohnS
Guru

Joined: 18/11/2011
Location: United Kingdom
Posts: 4221
Posted: 09:26pm 16 Dec 2025

Rob, I think that's too hard - currently.

The code could be squashed a bit (now, by hand) but would (I feel) be less and less readable/maintainable.  Getting 10K? My feeling is no chance.

John
 
phil99

Guru

Joined: 11/02/2018
Location: Australia
Posts: 2963
Posted: 09:48pm 16 Dec 2025

  Quote  but would (I feel) be less and less readable/maintainable.
Almost certainly less human-readable/maintainable, but once you go down the AI path, is that necessary?
If needed, AI can be used to explain how the resulting code works and to do the maintenance when required.

Edit.
It would be a bit like crunching an MMBasic program.
You would still have the source code and could re-do the whole process if there were serious problems.

AI can also be used to create a suite of test programs to ensure no new bugs are introduced.
Edited 2025-12-17 08:02 by phil99
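The golden-output style of suite described above can be sketched in a few lines of Python: run each stored test program through the interpreter and diff its output against a recorded expected file. This is a minimal sketch, not an existing MMBasic tool; the `.bas`/`.expected` file layout and the interpreter path are assumptions.

```python
import pathlib
import subprocess

def run_regression(interpreter, test_dir):
    """Run every test program and compare its output to a stored .expected file.

    Returns the names of programs whose output no longer matches.
    """
    failures = []
    for prog in sorted(pathlib.Path(test_dir).glob("*.bas")):
        expected = prog.with_suffix(".expected").read_text()
        # Invoke the interpreter on the test program and capture stdout
        result = subprocess.run([interpreter, str(prog)],
                                capture_output=True, text=True)
        if result.stdout != expected:
            failures.append(prog.name)
    return failures
```

Usage would be something like `run_regression("/usr/local/bin/mmbasic", "tests/")` (hypothetical path); an empty list means no regressions.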
 
Grogster

Admin Group

Joined: 31/12/2012
Location: New Zealand
Posts: 9863
Posted: 12:24am 17 Dec 2025

I rather like Dave's channel, and I watch a few of his videos, including this one.
Glad you thought enough of it to link it here.  
Smoke makes things work. When the smoke gets out, it stops!
 
LeoNicolas

Guru

Joined: 07/10/2020
Location: Canada
Posts: 558
Posted: 01:31pm 17 Dec 2025

There is another video where he creates an entirely new Notepad app using AI.

https://youtu.be/bmBd39OwvWg?si=1oRDqHgpkVV-SWYN

I agree with Peter. I've been using AI to speed up my work, helping with the development of some algorithms, APIs, unit tests, etc. I like the analogy of building an airplane: we cannot just ask an LLM to create an entire airplane project, with its hundreds of thousands of parts, but we can use AI to help with each part. The architects and engineers are responsible for the idea, the instructions, and the overall picture, and for putting it all together; the AI can help with each part.
 
pwillard
Guru

Joined: 07/06/2022
Location: United States
Posts: 334
Posted: 06:45pm 17 Dec 2025

The thing is, if you give the chatbot some meat to chew on, it will generally produce decent results compared to starting from scratch with just your initial prompting.

I recently asked the chatbot to help me write an image converter in Python, knowing that it has some decent image-mangling tools like the Pillow library.

I found an image-format descriptor write-up in C# (a language I know little about). Still, after being fed that header file and the related C# file, it produced a working Python script that could convert a proprietary Microsoft (read: mostly undocumented) image format to PNG files for editing, with just a couple of prompt iterations.

Amazing, actually.
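As a flavour of the kind of script involved, here is a minimal sketch of such a converter for a hypothetical raw RGB565 format. The 4-byte header layout is invented for illustration and is not the actual Microsoft format (or the script from the post); decoding is pure Python, and Pillow handles the PNG writing.

```python
import struct

def rgb565_to_rgb888(value):
    # Expand the packed 5/6/5-bit channels to 8 bits each
    r = (value >> 11) & 0x1F
    g = (value >> 5) & 0x3F
    b = value & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_image(blob):
    # Hypothetical header: little-endian uint16 width and height,
    # followed by width*height RGB565 pixels
    width, height = struct.unpack_from("<HH", blob, 0)
    data = blob[4:4 + width * height * 2]
    pixels = [rgb565_to_rgb888(v) for (v,) in struct.iter_unpack("<H", data)]
    return width, height, pixels

def convert_to_png(blob, out_path):
    # Pillow turns the decoded pixel list into a PNG file
    from PIL import Image
    width, height, pixels = decode_image(blob)
    img = Image.new("RGB", (width, height))
    img.putdata(pixels)
    img.save(out_path)
```

The real work in such a conversion is reverse-engineering the header and pixel packing, which is exactly the part the C# descriptor supplied.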
 
PeteCotton

Guru

Joined: 13/08/2020
Location: Canada
Posts: 602
Posted: 11:09pm 19 Dec 2025

  LeoNicolas said  I agree with Peter. I've been using AI to speed up my work, helping with the development of some algorithms, APIs, unit tests, etc. I like the analogy of building an airplane: we cannot just ask an LLM to create an entire airplane project, with its hundreds of thousands of parts, but we can use AI to help with each part. The architects and engineers are responsible for the idea, the instructions, and the overall picture, and for putting it all together; the AI can help with each part.


I came here to say the same thing, but Leo put it far more eloquently than I could. AI is amazing for small functions. It's useful for mid-size programs. It's useless at large systems.
 
lizby
Guru

Joined: 17/05/2016
Location: United States
Posts: 3580
Posted: 01:17am 20 Dec 2025

  PeteCotton said  AI is amazing for small functions. It's useful for mid-size programs. It's useless at large systems.


I'd agree, but would add: "at present". Two years ago it wasn't good at those things, and a year ago, hallucinations were much more of a problem than they are now.

Part of the problem is that it takes a lot of work to define the scope of a "large system" well enough for AI to tackle it.
PicoMite, Armmite F4, SensorKits, MMBasic Hardware, Games, etc. on fruitoftheshed
 
PeteCotton

Guru

Joined: 13/08/2020
Location: Canada
Posts: 602
Posted: 06:45am 22 Dec 2025

  lizby said  Part of the problem is that it takes a lot of work to define the scope of a "large system" well enough for AI to tackle it.


100% agree. But that's also the problem with large programs using human developers. Defining the design is a huge task.

I do think AI could also help quite a bit with the design phase, though. I have a potential project coming up next year to modernise a program I wrote 15 years ago. I'm interested to see how much help AI can be with the update.
 
lizby
Guru

Joined: 17/05/2016
Location: United States
Posts: 3580
Posted: 03:43pm 22 Dec 2025

  PeteCotton said  a potential project coming up next year to modernise a program I wrote 15 years ago. I'm interested to see how much help AI can be with the update.


Given a working program and specs for an update, I imagine AI would do a very good job.

One of my former managers would say, "The best documentation is a working program". That's even more true today, since AI can probably look at the program and abstract a specification (human-readable documentation).
PicoMite, Armmite F4, SensorKits, MMBasic Hardware, Games, etc. on fruitoftheshed
 
Volhout
Guru

Joined: 05/03/2018
Location: Netherlands
Posts: 5648
Posted: 08:10am 23 Dec 2025

Watching a bit from the sideline, learning to use AI slowly.

I am really curious how long AI will be "subsidized", in other words "semi free to use".
There are subscriptions, like a few dollars per week. But since there is serious benefit in using it (you can fire a lot of software developers), I expect the subscription cost will explode this decade. And somewhere the advertisements will creep in: it could be in biasing, or it could be like YouTube... you have to watch this advertisement before you can continue... or pay...

Volhout
Edited 2025-12-23 18:11 by Volhout
PicomiteVGA PETSCII ROBOTS
 
Mixtel90

Guru

Joined: 05/10/2019
Location: United Kingdom
Posts: 8486
Posted: 08:22am 23 Dec 2025

Very likely, I suspect. Someone is going to have to pay for the hardware and incredible energy usage at some point - and big business won't absorb it all forever. Shareholders and venture capitalists need to be paid, with decent profits. It's just that at the moment there isn't really a framework for charging small users yet. I'm pretty sure it will come.

However, the use of local AI systems rather than internet-connected ones, although far less capable, will probably boom.
Mick

Zilog Inside! nascom.info for Nascom & Gemini
Preliminary MMBasic docs & my PCB designs
 
lizby
Guru

Joined: 17/05/2016
Location: United States
Posts: 3580
Posted: 02:05pm 23 Dec 2025

  Volhout said  I am really curious how long AI will be "subsidized", in other words "semi free to use".


Too much competition, I believe. I subscribe to Gemini and ChatGPT-5 for $20 a month, and use perplexity.ai without a subscription (despite the nagging, it appears to continue to provide good answers even as it says I need to move to the Pro version). I haven't used the open-source versions, but they are said to be quite good.

I have no income-earning use for AI, but consider $40 a month to be an incredible value for what is provided. 50+ years ago I said I wanted the library to come to me. Now it does. And what's more, with AI, even the professors come to me.

  Mixtel90 said  Someone is going to have to pay for the hardware and incredible energy usage at some point - and big business won't absorb it all forever. Shareholders and venture capitalists need to be paid, with decent profits. It's just that at the moment there isn't really a framework for charging small users yet. I'm pretty sure it will come.

However, the use of local AI systems rather than internet connected ones, although far less capable, will probably boom.


Token prices continue to fall significantly. More recent figures are hard to come by, but perplexity says "Google’s clearly stated, large percentage reductions (50–80%) mostly refer to the mid‑2024 to October‑2024 period". Elsewhere I have read that costs for responding to prompts from retail users are one-sixth what they were a couple of years ago.

Business users who stand to gain much in greater productivity or lower costs (through the use of fewer employees) pay much more than retail customers (though not necessarily on a per-token basis).

AI may follow the price curve of computer hardware--more and more capability at a lower and lower cost. There's likely to be a lot of low-hanging fruit.

(On the professional level, my friend the mediation lawyer says that while he checks everything, a legally-trained AI offers him a skilled research assistant at a fraction of the cost of a human, and with results delivered much more quickly.)
PicoMite, Armmite F4, SensorKits, MMBasic Hardware, Games, etc. on fruitoftheshed
 
Mixtel90

Guru

Joined: 05/10/2019
Location: United Kingdom
Posts: 8486
Posted: 03:52pm 23 Dec 2025

Prices are falling, but IMHO it isn't because AI is becoming cheaper. It's more of a case of making some money back to prove to investors that they haven't lost out already. It only needs one to get edgy and others may also start to pull out. There is a real chance that it's an AI "bubble" and that a lot of people will lose a lot of money on it.

It's still too early to tell, but if the big data barns don't do something about their appetite for energy, then something *will* give way, as they won't be able to expand. Without expansion they aren't seen as a good financial risk. There won't be any new data centre development in some parts of the UK at the moment as it is; the waiting list for big enough grid connections is very, very long, and the total available generating capacity is too small for the demand.

Available power may be the crunch. Governments, not big business, will decide how much of a country's land area can be given up to data barns and the power for them.
Mick

Zilog Inside! nascom.info for Nascom & Gemini
Preliminary MMBasic docs & my PCB designs
 
matherp
Guru

Joined: 11/12/2012
Location: United Kingdom
Posts: 10923
Posted: 04:43pm 23 Dec 2025

Energy requirements per token are also falling. I just installed an RTX 4000 Ada SFF in my computer: 70 W maximum, no extra power connector required, and faster than my previous A4000, which required 140 W. Competition is coming for Nvidia from AMD and Intel, and watts per token will be a major selling factor.
 
lizby
Guru

Joined: 17/05/2016
Location: United States
Posts: 3580
Posted: 06:28pm 23 Dec 2025

  Mixtel90 said  There is a real chance that it's an AI "bubble" and that a lot of people will lose a lot of money on it.


True (and maybe inevitable for "a lot"), but that has nothing to do with the eventual capability and eventual consumer cost of AI.

Railroads in the U.S. after the Civil War were a huge bubble, but more than paid off for the country as a whole and for those who purchased railroads out of bankruptcy. Those who built the skyscrapers of New York and Chicago went bankrupt, but those assets were profitable to the cities and the investors who scooped them up.

Of “IBM and the Seven Dwarfs” (IBM, Burroughs, UNIVAC, NCR, CDC, Honeywell, RCA, GE), who remains standing?

Lots of money was lost in the tech crash of 2000, but the crash doesn't register on a graph of Moore's Law. The internet buildout companies crashed (Lucent was once the most highly valued company in the world), but the internet succeeded.

~
Edited 2025-12-24 04:31 by lizby
PicoMite, Armmite F4, SensorKits, MMBasic Hardware, Games, etc. on fruitoftheshed
 
The Back Shed's forum code is written, and hosted, in Australia.
© JAQ Software 2026