Cake day: Jun 13, 2023


I was quite literally illustrating the absurdity by being similarly absurd. When someone tells people to shut the fuck up about an issue, it’s funny as hell to respond with a similar statement.

We measure success by how many GBs we’ve consumed when the only keys pressed from power-on to desktop are our password. This shit right here is the real issue.

Satire conveys a known truth: some already understand it implicitly, some don’t want to acknowledge it, some refuse it outright, but when you think about it, we’ve always known how true it is. It’s tongue-in-cheek, but it’s necessary in order to convince all these AI-washing fuckheads what a gimmick it is to make sweeping statements about a chatbot that still can’t spell lollipop backwards.

If you got “Apple is destroying creativity” from that, you A) only saw the clip or B) are searching for a problem.

The outcry’s not really about Apple, but tech in general. The backlash against crushing the human experience is about the transition away from valuing true art and creativity, lurching toward yet another do-everything screen that doesn’t complement creativity but instead displaces it, with the hint of incoming generative AI.

Apple really doesn’t give a fuck about art, creativity, expression, or, for that matter, quality anymore. They’re good at making a thing that sells, they’re good at marketing it, and they’re good at convincing people of the cost-vs-worth equation that gives them insane margins over their chic branding. I love the outcry not because of any validity behind the detriment of tablets and smartphones (which is absolutely there) but more so because it’s entertaining when a company renowned for its advertising prowess fucks up so publicly and then backpedals with apologies.

Good times, and fuck Apple.

That’s another “let them eat cake” moment. The stimulus checks covered a month’s groceries at best for most families.

I remember thinking movies just had absurd, sensationalized plot lines and that our societies were past that shit. Then I saw former Soviets killed by alpha-particle-emitting pills, whistleblowers dying, and now I’m thinking the truth is stranger than fiction.

Hang in there, Edward Snowden. It’s amazing that fucker threaded the needle and still lives.

That new Chevy RST is looking pretty badass. It isn’t just marginally better than a Cybertruck; it’s objectively superior in nearly every aspect where it can be superior.

So, if the AI generated tits look real, but they’re not HER tits, is it just less terrible?

A weeklong battery life, efficient cores, rapid response time, and a great software environment make it a great choice…at 16GB for my needs. I will not recommend 8GB to any user at all going forward. It’s marketing malarkey with no future-proofing, degrading the viable longevity of the machine.

There’s no conversation to continue. Glass is glass, and 8GB is 8GB, as well as being a joke.

15 tabs of Safari, which some argue is a better browser due to its efficiency and available privacy configuration options. What if you prefer Chrome or Firefox?

I will argue in Apple’s defense that their stack includes very effective libraries that intrinsically make applications on macOS better in many regards, but 8GB is still 8GB, and an SoC isn’t upgradeable. The competition has far cheaper 16GB options, and Apple is back to looking like complete assholes again.

I’m actually deeply familiar with the architecture and how caches, memory, and unified memory work. I understand all of that. None of that changes the capacity available. Having high memory bandwidth to load and unload memory addresses doesn’t fix the issue of the environment easily exceeding 8GB. I also understand the caching principles and how you actually want RAM utilization to be higher for faster responsiveness. 8GB is still 8GB, and a joke.
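To put rough numbers on it, here’s a back-of-the-envelope sketch in Python. Every footprint below is an illustrative assumption of mine, not a measurement, but it shows how an ordinary session blows past 8GB before you even count the GPU carving its share out of the same unified pool:

```python
# Back-of-the-envelope working-set sketch. All footprints are
# illustrative assumptions, not measurements.
GIB = 1024**3

total_ram = 8 * GIB

# Apple silicon GPUs have no dedicated VRAM; the GPU's allocation
# comes out of the same unified pool as everything else.
gpu_carveout = int(1.5 * GIB)  # assumed share for display/compositing

apps = {
    "macOS base (kernel, daemons, WindowServer)": 3 * GIB,
    "browser with a dozen tabs": 3 * GIB,
    "mail + chat clients": int(1.5 * GIB),
    "office docs or an IDE": int(1.5 * GIB),
}

working_set = gpu_carveout + sum(apps.values())
print(f"working set: {working_set / GIB:.1f} GiB on {total_ram / GIB:.0f} GiB of RAM")
print("over budget, swap/compression kicks in" if working_set > total_ram else "fits")
```

Swap and memory compression keep a session like that alive, but that’s exactly the thrashing territory in question; no amount of bandwidth changes the size of the pool.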

You do have a point, but I think the intent of the article is to convey the common understanding that Apple is leaning on sales tactics to sell a claim that anyone with technical acumen sees through immediately. Regardless of how efficient Mach/Darwin is, it’s still apples to apples (pun intended) to understand how quickly 8GB fills up in 2024. Those who need a fully quantitative performance comparison between 8 and 16GB, with enough applications loaded to show the thrashing that sets in, aren’t really the audience. THAT audience is busy reading about gardening tips, lifestyle, and celebrity gossip.

Looks like you didn’t read the article either.

Overall, I’m using 12.5GB of memory and the only application I have open is Chrome. Oh, and did I mention I’m typing this on a 16GB MacBook Air? I used to have an 8GB Apple silicon Air and to be frank it was a nightmare, constantly running out of memory just browsing the web.

Earlier it’s mentioned that they have 15 tabs open. I don’t like a lot of what they do in “gaming journalism”, but on this article they’re spot on. Apple is full of shit saying 8GB is enough by today’s standards. 8GB is a fuckin joke, and you can’t add any RAM later.

It’s not that it’s more efficient, it’s simply used less than in conventional PC architecture.

It’s not that you’re wrong from a philosophical perspective; it’s that you’re factually incorrect. Memory addresses don’t suddenly shrink or expand depending on where they exist on the bus or the CPU. Being on the SoC doesn’t magically make RAM used less by the OS and applications; the Mach kernel, Darwin, and the various macOS layers still address the same amount of memory as they would on traditional PC architecture.

Memory is memory, just like glass is glass, and glass will still scratch at a level 7 just like 8GB of RAM holds the same amount of information as…8GB of RAM.

The article actually quantitatively tests this too by pointing out their memory usage with Chrome and different numbers of tabs open.
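The shape of that test boils down to simple arithmetic. Here’s a rough sketch of it; the baseline and per-tab figures are my own illustrative assumptions, not the article’s measurements:

```python
# How quickly browser tabs eat an 8 GiB pool.
# Baseline and per-tab figures are illustrative assumptions.
GIB = 1024**3
MIB = 1024**2

os_baseline = 3 * GIB      # assumed idle macOS footprint
browser_base = 600 * MIB   # assumed browser overhead before any tabs
per_tab = 250 * MIB        # assumed average cost per open tab

for tabs in (5, 15, 30):
    used = os_baseline + browser_base + tabs * per_tab
    headroom = 8 * GIB - used
    print(f"{tabs:>2} tabs: {used / GIB:.1f} GiB used, "
          f"{headroom / GIB:+.1f} GiB headroom")
```

With assumptions anywhere in that ballpark, the budget is gone within a few dozen tabs, while the same session on a 16GB machine still has roughly half the pool free.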

Looks like you didn’t read the article.

  • VRR is finally in but experimental, and will need a refactor of the explicit sync codebase once that’s merged in…so we’re 4 years away from usable?
  • Fractional scaling still experimental after 4 years, so a terrible experience for people with 4K displays
  • Still no UI menu to alphabetize apps automatically
  • Still no proper StatusNotifierItem (tray icon) solution, so people will still install that extension
  • Still no auto-hide for Dash, back to using dash-to-dock
  • Breakage for extensions AGAIN

Garbage project by garbage devs, backed by a garbage corporate sponsor (IBM). Expect a garbage experience; thanks for nothing, Guh-nome. Thank goodness for KDE and the Plasma project, which has UI solutions for all of the above and much, MUCH more to make a cohesive, well-functioning DE with sane UX.

And I’m sure it was all the workers’ fault, none of the upper crust fuckers who wasted the money will see any penalty for it.

I already moved over to Proton in anticipation of Google charging for Gmail. Gmail’s untouchable as far as longevity goes, but it’s going to be majorly enshittified to the point that leaving it will be more desirable than staying.

It’s great to see a service strategy that revolves around delivering value to your users, who then recommend it on your behalf, instead of revolving around delivering the least value while extracting the most revenue possible, so you can hand it to shareholders after executive compensation eats it up.

Way to go Proton! Easiest money I spend every year too.

I think across the economy, a lot of companies overbuilt, and then things went back to pretty close to exactly the way they were before. I think a lot of companies realized they’re not in a good financial place.

Zuckerberg is a miserable motherfucking shit cunt scumbag fuckface.

There’s a HUGE difference, and his statement is very false. Corporate profits were stellar when they shit-canned people. It’s artificial demand reduction thanks to the Fed, which causes layoffs and pain for workers but NOT for corporations, whose revenues are still at record highs. The fourth circle of Hell in Dante’s Inferno would be a kindness to executives; the eighth circle, Fraud…that’s where he belongs. Sometimes I hate being an atheist, because I do wish nothing but misery and torment on the wealthy for all time. I can see where religion gets some serious legs, and during the French Revolution, that’s where they thought they were sending their aristocrats.

If only we could have the luxury of that belief too.

Giant rat penises will only hurt you if you have an underlying medical condition (anal fissures, etc).

Well said, and agreed. The headset market is a dead end for anything beyond niche. Their price point is ironically appropriate given the lack of mass appeal, so they have to go with a mind-numbingly high ARPU to make that business unit work for them. Culturally, such a product is a non-starter, especially as people read about the isolating effects tech has had lately. The fever dream of the “Metaverse” is now widely mocked.

Things like the automatic IPD adjustment are a dream come true though, along with using eye tracking as a mechanism for driving the UI/UX. That shit was fiction just ten years ago in Iron Man. I can just imagine how cool it would be to navigate the menus in a game like Elite Dangerous with something like that, but the engineering needed to develop it is a non-starter given the lack of an install base, not to mention the financial condition of app developers and other companies that aren’t established like Apple is.

I’m no fan of these layoffs, but a company like Docusign having over 7k employees is mind-blowing to me. They could probably GET BY with 440 employees and outsourcing customer service entirely.

Exactly. Laypeople need an explanation of the AI-washing fad, among other things. It surprises me how many people don’t know about the Google graveyard. The Stadia launch is still fresh in my memory, with how it was supposed to be the next major step forward.

We’ve heard about “the next major step forward” so many times…but as for this blog post, it’s almost as though they want the comments to write the blog for them based on a superficial notion. Just like big tech.

All kidding aside, the passenger experience is a lot better anyway. The overhead storage bins on the newer Airbus planes are a hell of a lot better, not to mention the infotainment systems that airlines seem to opt for. The way they integrate and function vs the Boeing Dreamliners is a pretty stark contrast.

The old saying, “If it ain’t Boeing, I ain’t going”, just needs to be slightly tweaked to be accurate today XD

There’s quite a disparity between what’s in the headline and what’s in the article. One really good point brought up was

Many technology leaders then struggle to keep track of what they have tasked project teams to accomplish or to hold them accountable for deliverables. In blame cultures or environments where difficult conversations are avoided, it’s often easier to let someone else go than admit to internal failings.

When you read more into what the “source” for the article was, it looks a whole lot more like an incompetent VP trying to play a game to avoid accountability by lying to the people under him while his own leadership asks what in the fucking hell is going on with their section of the company.

After reading through all that, it stinks a whole lot more of mismanagement and gross incompetence on Amazon’s part than of any strategy, especially if you apply Occam’s Razor. What’s more likely: that the company that misjudged and executed poorly, with a string of other stupid decisions involved, suddenly schemes up another “quiet firing” strategy, or that underperforming leadership is trying to sweep its own accountability under the rug?

My bet would go toward the latter. I don’t see quiet firing, I see gross incompetence and fear.

I agree with you, AI is a thing alright, an overhyped chatbot thing. LLMs are going to be neutered by pandering, and their true potential will be limited by investor fear and paranoia.

Fuck nVidia. I hope AMD’s MI300 eats their AI lunch so they can go crawling back to the gaming table. Motherfuckers.

It takes an MBA to make shitty decisions, make 3,000% higher compensation than the average employee, and then turn around and lay off others because of your own shitty decisions. I think you’re being real kind with the assbag label there. I’m thinking a term like “parasitic shit-cunt” gets slightly closer, but I just can’t think of anything derisive enough to satisfy the enmity I have for them.

Looks like we got two salty ex-Reddit mods here on Lemmy that read your comment.

From the article

Gelsinger explained how he thinks Nvidia and its CEO, Jensen Huang, just happened to be in the right place at the right time.

Well, no shit. That’s how ALL rich people got rich: luck.

“Unlucky cofounders”: the lowest net worth among them is $7 million. Ian’s not doing half bad, considering most people will work their entire lives and never reach even half of that.

The next one up has a net worth of $200 million.

Another dumbass fucking hunk of shit radio system doesn’t deliver on its promises? Oh my god, what a gigantic fucking surprise! People are gonna have a stroke when they see that shit!

Thermocline trust inversion: a perfect example of why customer trust continues to erode and corporations continually lose credibility. Granted, Sony’s not the only bad actor here; the agreements between the businesses involved were poor to begin with. The end result is a negative customer experience with all the brands involved.

When the industry fails like this, we go back to incentivizing torrents.

That explains why so many product managers got laid off. They were targeting the source of the lawsuit.

There are a few things I wonder about with this. They tested an AMD GPU, which is great as it shows off how awesome the ACO and RADV projects are. The Mesa devs, alongside the AMD folks contributing to the RADV driver and all of Valve’s work, showcase how awesome FOSS can be at optimizing. I watched RADV go from underdog to top dog in performance, and it’s no longer arguable that AMD hardware works better under Linux than Windows thanks to the stellar work done on those projects.

Meanwhile we have nVidia who clings selfishly to their proprietary blobs, and I can’t help but wonder how great it could be if they opened that up and let the community in. Is it already hyper optimized to the point that the community wouldn’t be able to improve on it? Idk, but from an infrastructure standpoint, nVidia users would benefit from it immensely, if not from performance.

I recall when vkd3d-proton performance was severely impacted under nVidia, to the tune of a 40% delta, and that’s improved significantly, but I still wonder how this would look on an nVidia GPU to compare Windows vs Linux performance.