There are so many definitions of OOP out there, varying between different books, documentation and articles.
What really defines OOP?
Dude, you’re going to shit bricks when you realize most computer science jargon is just marketing buzzwords stacked on top of marketing buzzwords, and the terms never meant anything more or less than what was needed to sell a product.
For example, what the hell is big data? What is a scripting language? Is your DB web scale?
Once upon a time, “big data” meant datasets large enough that it was impractical to store or work with them in traditional relational database software. That’s where distributed systems came into play, spreading both storage and computation across clusters of machines with solutions like Hadoop and MongoDB. That seemed to be the direction things were heading 10-15 years ago.
However, with the automated scaling built into modern cloud databases, the line has gotten a bit blurry; Snowflake, Redshift, and BigQuery all handle many billions of rows just fine. I probably wouldn’t use the term “big data” in a professional context these days, but there is a table size after which I write code a bit more carefully.
I suppose my point is that the term once meant something, but marketing stole it because it sounds cool. I worked in a tech shop in the late aughts where the sales team insisted on calling every rack mounted server a “blade server”, regardless of whether it had modular swappable boards. Because it sounded cool.
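For what it’s worth, the “spread storage and computation across a cluster” idea that Hadoop popularized boils down to a map/reduce split: a map step that runs independently on each chunk of data (so it can be farmed out to many machines) and a reduce step that merges the partial results. A minimal single-machine sketch in plain Python, no Hadoop involved, with all names being my own:

```python
from collections import Counter
from functools import reduce

# Toy "dataset" split into chunks, standing in for data spread across machines.
chunks = [
    "big data is just data",
    "data that does not fit on one box",
]

def map_chunk(text):
    # Map step: runs independently per chunk, so it could be farmed out to many machines.
    return Counter(text.split())

def reduce_counts(a, b):
    # Reduce step: merge two partial results; merging order doesn't matter.
    return a + b

partials = [map_chunk(c) for c in chunks]            # could run on separate machines
totals = reduce(reduce_counts, partials, Counter())  # merged back into one result
print(totals.most_common(3))
```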
The way I remember it, it wasn’t even that the whole dataset was too large, but the individual records. Hadoop, for example, isn’t doing anything magic; it’s just a software package that extended MySQL so it could efficiently store things like pictures as records (Facebook’s original use case, though of course it evolved).
I guess big data is whatever you need it to be to justify whatever you want to justify. At one of my gigs, it was public funding for a project.
That’s all of CS and IT.
Big data is when we align our agile synergies at scale.
Wait, so, the Cloud is actually just a bunch of other computers, called servers, and the only real innovation is basically a load balancing system?
Next you’re gonna tell me I won’t be able to stream lagless video games and also do competitive multiplayer on my Google Stadia, pff, like you’re some kind of expert or something.
/s
Negative latency!
God, I still cannot believe how obviously bullshit that all was, and how many fucking idiots parroted it hook, line, and sinker.
Google casually violating causality