ChatLGBT is the best A (artificial, not artificial intelligence).
How to run?

REGARDLESS OF PLATFORM, PLEASE PLACE THE CHATLGBT EXECUTABLE IN AN EMPTY FOLDER, BECAUSE IT WILL GENERATE A memory.txt FILE!
I RECOMMEND PLACING A READY-MADE memory.txt FILE IN THE SAME FOLDER AS CHATLGBT; YOU CAN DOWNLOAD memory.txt FROM CHATLGBT ONLINE USING THIS LINK:
(Dead link) (right-click and choose "Save page as")



If you are on Windows, just extract the archive and double-click the ChatLGBT_Windows executable.

If you are on Linux/MacOS, run ChatLGBT_Linux/MacOS in the terminal using the command './ChatLGBT_Linux/MacOS'
If that doesn't work, make the file executable first with 'chmod +x ChatLGBT_Linux/MacOS' and then run it again.

The Mac binary works on MacOS 11 and later and should run on both Apple Silicon and Intel without using Rosetta.

If you are on any other platform, compile ChatLGBT yourself with g++ using the command:
'g++ -O3 ChatLGBT.cpp -o (output file) -static-libgcc -static-libstdc++'
or use the Python version (also MicroPython compatible)

TROUBLESHOOTING:
Q: ChatLGBT (the executable) can't read some characters, what do I do?
A: Just use the Python/Online version, OR normalize the whole memory.txt file with this tool: https://onlinetools.com/unicode/normalize-unicode-text
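If you'd rather normalize memory.txt locally instead of using the online tool, a short Python sketch like the one below should work. This is an assumption-based example, not part of ChatLGBT itself: it assumes memory.txt is UTF-8 encoded and that NFC normalization (composing accented characters into single code points) is what the executable needs.

```python
import unicodedata

def normalize_memory(path="memory.txt"):
    # Read the file, apply NFC normalization so decomposed character
    # sequences (e.g. 'e' + combining accent) become single composed
    # code points, then write the result back in place.
    with open(path, encoding="utf-8") as f:
        text = f.read()
    with open(path, "w", encoding="utf-8") as f:
        f.write(unicodedata.normalize("NFC", text))

# Example: 'e' followed by a combining acute accent (two code points)
# composes into the single code point U+00E9 ("é").
assert unicodedata.normalize("NFC", "e\u0301") == "\u00e9"
```

Run it once in the same folder as memory.txt before starting ChatLGBT.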

Q: I say something and the bot just responds with a blank, what do I do?
A: Please use sentences of at least two words; the bot can't learn from one-word responses.

Q: It doesn't work on MicroPython, what do I do?
A: You are probably running an outdated version of MicroPython; please update to the latest version.
