This is so hot! Thanks 😋
https://en.m.wikipedia.org/wiki/Bullet_Cluster
The new compound is:
“… molecular hybrid between (S)-lactate and the BHB-precursor (R)-1,3-butanediol in the form of a simple ester referred to as LaKe…”
capabilities:
High-temperature heat pumps, which can potentially deliver temperatures between 100 °C and 200 °C to industry, are being implemented in many European companies.
The article does not provide figures on how much equivalent insulation this infrared-controlling layer produces.
Nor does it provide estimates for the cost/price.
paywalled
and is likely to remain so.
Well, in fact I don’t care at all about that last statement of mine. So if that is all you disagree with in my reading of the article, then it’s fair game for me.
Original source (free access):
https://onlinelibrary.wiley.com/doi/10.1002/advs.202303835
So, if I read it correctly, they do not modify the fiber, so no training information is stored in the fiber.
They do not have light that can learn by itself either. Instead, they notice that a very reproducible noise pattern is created, and they train a machine outside of the optical fiber to recognize which parts of this noise can be interpreted as information. All of this is in fact very power-costly, and is likely to remain so.
Edit : I removed my last statement because I don’t want to start bickering about sterile nonsense.
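The scheme described above, training an external model to decode a fixed but reproducible "noise" transformation, can be sketched in a few lines. This is only a toy illustration in the spirit of reservoir computing, not the paper's actual method: the matrix standing in for the fiber, the task, and all sizes are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the fiber: a fixed, reproducible nonlinear scrambling of the
# input. Purely illustrative; the real system is optical hardware.
scramble = rng.normal(size=(64, 16)) * 0.25

def fiber(x):
    # The same input always produces the same "noise" pattern,
    # so the pattern still carries information about the input.
    return np.tanh(scramble @ x)

# Toy task: recover the sign of the first input component.
X = rng.normal(size=(200, 16))
labels = (X[:, 0] > 0).astype(float)
patterns = np.array([fiber(x) for x in X])

# The "machine outside the fiber": a simple least-squares linear readout
# trained to map the noise patterns back to labels.
w, *_ = np.linalg.lstsq(patterns, labels - 0.5, rcond=None)
preds = (patterns @ w > 0).astype(float)
accuracy = (preds == labels).mean()
```

The point of the sketch is that all the learning happens in the readout outside the "fiber"; the scrambling itself is never modified, which matches the comment's reading of the article.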
Typo in title: “ine” instead of “mine”.
Source has this title now:
Mistral CEO confirms ‘leak’ of new open source AI model nearing GPT-4 performance
Two excerpts:
Mistral co-founder and CEO Arthur Mensch took to X to clarify: “An over-enthusiastic employee of one of our early access customers leaked a quantised (and watermarked) version of an old model we trained and distributed quite openly…
To quickly start working with a few selected customers, we retrained this model from Llama 2 the minute we got access to our entire cluster — the pretraining finished on the day of Mistral 7B release. We’ve made good progress since — stay tuned!”
Quantization in ML (machine learning) refers to a technique that makes it possible to run certain AI models on less powerful computers and chips by replacing specific long numeric sequences in a model’s architecture with shorter ones.
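Concretely, “replacing long numeric sequences with shorter ones” usually means storing weights at lower precision. A minimal sketch of symmetric 8-bit quantization (the weight values here are made up for illustration):

```python
import numpy as np

# Made-up weights of one layer, stored as 32-bit floats (4 bytes each).
weights = np.array([0.42, -1.37, 0.05, 2.96, -0.88], dtype=np.float32)

# Symmetric 8-bit quantization: map the float range onto int8 in [-127, 127].
scale = float(np.abs(weights).max()) / 127.0
quantized = np.round(weights / scale).astype(np.int8)  # 1 byte per value

# At inference time the small integers are scaled back to approximate floats.
restored = quantized.astype(np.float32) * scale

print(quantized.nbytes, "bytes instead of", weights.nbytes)  # 5 instead of 20
```

The memory footprint drops 4x, at the cost of a small rounding error per weight, which is why a leaked quantized model can run on weaker hardware while performing slightly below the original.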
Well, t.i.l., thanks 👍
You used the microwave ? 🤨
Hi ruffsl,
thanks, I like your post :)
Please note your first link is faulty:
“ttps://en.wikipedia.org/wiki/Wave_function_collapse”
The “h” of “https” is missing!
differentiation between the characters I, l, and 1.
… and between “rn” and “m”.
Mastering the software is the easy part. Doing something useful with it, by understanding the outside world and doing real work, is something else.
Thanks for many interesting technology posts.