I know the typical answer is “no, because all the libs are in Python”… but I’m kind of baffled why more porting isn’t going on, especially to Go, given that Go, like Python, is stupid easy to learn and yet much faster to run. Truly not trying to start a flame war or anything. I’m just a bigger fan of Go than Python, and I was thinking, coming into 2024, especially with all the huge money in AI now, we’d see a LOT more movement toward Go’s much faster runtime while keeping code that’s largely as easy, if not easier, to write and maintain. Not sure about Rust… it may run a little faster than Go, but the language is much more difficult to learn/use. It has been growing in popularity, though, so I was curious whether it’s a potential option.
There are some Go libs I’ve found, but the few I have seem to be 3, 4, or more years old. I was hoping there would be things like PyTorch and the like converted to Go.
I was even curious, with the power of GPT-4 or DeepSeek Coder or similar, how hard it would be to run conversions of Python libraries to Go. Is anyone working on that, or is it pretty much impossible to do?
I’m mostly using Bash and Perl.
Yep, and the future is optimizers, custom compiled CUDA kernels, and (eventually) more ASIC chips. Python is just the glue that’s commonly used. It’s good glue, though, but there are other glues…
This is quite simple for me… I only know Python and very small amounts of JavaScript/HTML/CSS. More important than efficiency gains is just me getting the job done, which really is an efficiency gain in itself.
OK… so that’s fair, but I would counter with this: if Go/Rust were going to increase the runtime performance of training/using the AI models by a factor of 2, 3, or more, and the time to learn Go is a couple of weeks and Rust a couple of years (kidding… sort of), and your job for years to come is doing this sort of work, and the end result of, say, Go is training much faster or doing data prep much faster… wouldn’t the benefits of learning Go or even Rust be worth it for the compounding time savings in training/running, as well as memory efficiency, fewer resources needed, etc.?
Not saying you should, because I don’t even know if Go/Rust/Zig would result in much faster training/etc. I would assume that if that were the case, companies like OpenAI would have been using these languages already, since they had the money and time to do so.
I’m not exactly a top-tier programmer, so anything I make is lucky to work. I would always consider using the best language for the job, though, given the resources, so ya.
I use C#. Initially I’d gone all out trying to wrap Llama.cpp myself, but my wrapper was getting outdated in a matter of weeks, and it was going to take a ton of effort to keep up.
So instead I run a local ooba server and use the API. I get to do all my business logic in nice, structured C#, while all the Python stuff stays in ooba, and I don’t really have to dig into it at all.
I saw an interesting article somewhere that showed you can be a lot more memory efficient doing inference with Rust, since you don’t have several GBs of Python dependencies.
We are using Rust (WebAssembly) in Edgechains. Looking for contributors to help build the AI WebAssembly runtime!