• 2 Posts
  • 93 Comments
Joined 1Y ago
Cake day: Aug 03, 2023


If only tech journalists bothered to do even a superficial amount of research, instead of being spoon-fed spin from tech bros with a profit motive…

This is outrageous! I mean the pure gall of suggesting journalists should be something other than part of a human centipede!



It’s beyond time to stop assuming that whatever would make your source the most money is literally true, and to stop parroting it without verifying any of it.


Isn’t Yelp a pretty easily replaceable thing?

They built a reputation by being one of the first in the space, but they’ve squandered that reputation, and I’m pretty sure someone else could start up a competing “reviews” product.

I’d like one that actually showed the history of places like restaurants. If the head chef leaves and the reviews have gone to shit, the reviews since the new chef took over are much more relevant than the 1000+ five-star reviews of the old guy’s food, and that isn’t discoverable anywhere on Yelp or anything like it.

I’m not sure how you’d protect against enshittification long-term. But I think one of the things that has largely poisoned the spirit of the Internet is that everything is always about a “sustainable business model” and “scaling” before anyone even dreams of just building something and seeing whether it catches on.


I get Dreamweaver vibes from AI-generated code.

Same. AI seems like yet another attempt at RAD (rapid application development), just like MS Access, Visual Basic, Dreamweaver, and even to some extent Salesforce or ServiceNow. There are so many technologies that champion this…RoR, Django, Spring Boot…the list basically goes on forever.

To an extent, it’s more general-purpose than those because it can be used with multiple languages or toolkits, but I find it not at all surprising that the first use of gen AI in my company was to push out “POCs” (proofs of concept, the vast majority of which never amounted to anything).

The same gravity applies to this tool as to everything else in software: prototyping is easy, integration is hard (unless the organization is well structured, which, well, almost none of them are). Software executives tend to confuse a POC with production code and want to push it out immediately, only to find out that it’s a Potemkin village underneath, as they were sometimes (or even often) told the entire time.

So much of the software industry is “JUST GET THIS DONE FASTER DAMMIT!” from middle managers who still seem (despite decades of screaming this) to have developed no widespread means of determining either what they want to get done or what it would take to get it done faster.

What we have been dealing with the entire time is people who hate being dependent on coders or other “nerds”, but who need them in order to create products that accomplish their business objectives.

Middle managers still think creating software is algorithmic nerd shit that could be automated…solving the same problems over and over again. It’s largely been my experience that, despite even Computer Science programs giving it that image, modern coding is in reality more akin to being a millwright. Most of the repetitive, algorithmic nerd shit was settled long ago and can be imported via modules. Imported modules are analogous to parts, and your job is to build or maintain the actual machine that produces the desired outcomes: connecting parts so the various components interoperate as needed, repairing failing components, or spotting the shoddy welding between them that is making the current machine fail.


That’s it. Don’t respond to the points and the obvious contradictions in your bad arguments, explicable only by your personal hard-on for the tool; just keep shitposting through it instead.


Lol, it couldn’t determine the right number of letters in the word strawberry using its training before. I’m not criticizing the training data. I’m criticizing a tool and its output.

It’s amusing to me that at first it’s “don’t blame the tool when it’s misused” and now it’s “the tool is smarter than any individual dev”. So which is it? Is it impossible to misuse this tool because it’s standing atop the shoulders of giants? Or is it something that has to be used with care and discretion and whose bad outputs can be blamed upon the individual coders who use it poorly?


Why are you typing so much in the first place?

Software development for me is not a term paper. I once encountered a piece of software in industry that maintained, via a hashmap and thousands of lines of code, what in any sane piece of software would be a database.

AI makes software like this easier to write without your eyes glazing over, but it’s been my career mission to stop people from writing this type of software in the first place.
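
Just to illustrate the shape of the thing (the names and data here are hypothetical, not the actual system I ran into): a hand-rolled hashmap “database” makes you maintain every secondary index by hand, while an actual database does the same job in a few lines.

```python
# Hypothetical sketch of the anti-pattern: a dict pretending to be a database,
# with a hand-maintained secondary "index" that every write has to keep in sync.
orders = {}               # order_id -> record
orders_by_customer = {}   # customer -> set of order_ids (manual index)

def add_order(order_id, customer, total):
    orders[order_id] = {"customer": customer, "total": total}
    # Forget this line (or the matching update/delete logic) and lookups silently rot.
    orders_by_customer.setdefault(customer, set()).add(order_id)

# The same idea with an actual database; sqlite3 ships with Python.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id TEXT PRIMARY KEY, customer TEXT, total REAL)")
db.execute("CREATE INDEX idx_customer ON orders (customer)")
db.execute("INSERT INTO orders VALUES (?, ?, ?)", ("a1", "alice", 19.99))
rows = db.execute("SELECT order_id, total FROM orders WHERE customer = ?", ("alice",)).fetchall()
```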


They’d rather zone out and mindlessly click, copy/paste, etc. I’d rather analyze and break down the problem so I can solve it once and then move on to something more interesting to solve.

From what I’ve seen of AI code in my time using it, it’s often an advanced form of copying and pasting. It frequently takes problems that could be solved more efficiently with fewer lines of code, or by generalizing the problem, and does the (IMO evil) work of making the solution that used to require the most drudgery the easy one.
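
A made-up example of what I mean by generalizing instead of pasting (the function names and fields are hypothetical):

```python
# The copy/paste version: near-identical functions, one per field, each a place for bugs to hide.
def total_by_region(rows):
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
    return totals

def total_by_product(rows):
    totals = {}
    for row in rows:
        totals[row["product"]] = totals.get(row["product"], 0) + row["amount"]
    return totals

# The generalized version: one function, one place to fix bugs, works for any field.
def total_by(rows, key):
    totals = {}
    for row in rows:
        totals[row[key]] = totals.get(row[key], 0) + row["amount"]
    return totals
```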


It’s not about it being counterproductive. It’s about correctness. If a tool produced a million lines of compilable gibberish unrelated to what you’re trying to do, then from a pure lines-of-code perspective it would be a productive tool. But software development is more complicated than writing the most lines.

Now, I’m not saying that AI tools produce pure compilable gibberish, but they don’t reliably produce correct code either. So they fall somewhere in the middle, and similarly to “driver assistance” technologies that half-automate things but require constant supervision, it’s quite possible that the middle is the worst place for a tool to fall.

Everywhere around AI tools there are asterisks about them not always producing correct results. The developer using the tool is ultimately responsible for the output of their own commits, but the tool itself shares in the blame because of its unreliable nature.


Some tools deserve blame. In this case, you’re supposed to use it to automate away certain things, but that automation isn’t really reliable. If it has to be babysat to the extent that I’d argue it does, then it deserves some blame for being a crappy tool.

If, for instance, the getter/setter generators or refactoring tools in IDEs routinely screwed up in the same ways, people would say the tools were broken and shouldn’t be used. I don’t get how this is different just because of “AI”.


I’m pretty perfectionist about some things, but I honestly forget about this little crease in my phone all the time. I thought I might give a shit before I bought a Motorola Razr last year, and now I often forget that it’s a foldable. Imagine, if you will, a phone that actually fucking fits in your jean pockets…it’s worth the little (often invisible) crease.


It is fine to have casual knowledge of or a hunch about something, but far better to have the research and analysis to prove it.


You can’t spy on our citizens, that’s our (and our corporations’) job!

Signed, the US Government


I’d rather they just ban spy apps in general…but that’s a “dream a little dream, it’s never gonna happen” type of thing.




It’s funny how, in this country, the public rhetoric is inundated with people decrying this type of behavior from governments as “Big Brother” and 1984, but as soon as you slap an Inc. on the back and make it Big Brother, Inc., everyone’s completely fine with it.


Now they’re defiant? I thought the whole thing was that they were the silent majority or some bullshit.

I knew serial conman Trump couldn’t resist getting involved in crypto, which is one of the biggest cons of all time.


I’m a millennial but not one of those people you’re describing, and I have actually paid my condo off.

The keys for me:

  • No kids
  • I job hopped in (what at least used to be) a high-paying field (tech)
  • I moved job markets from a low COL (cost of living) market to a high COL market
  • No student loan debt for me (my mommy and daddy paid for my tuition to a local state school 🫶 ), minimal student loan debt from my wife (~5k)…which I paid off after we got married
  • I don’t give a shit about cars…I drove used cars until I could comfortably buy a new one cash
  • We only have one car between the two of us
  • I moved rather than paying higher rents, and I often lived in really crappy apartments because they were cheaper (I do not recommend it, btw)

Healthy helping of luck involved, and definitely support from my parents by way of room and board until I was like 23, tuition, and a small car loan of ~8k after I graduated. However, I paid them back in full for the car, and I’m the only one of my siblings not to hit up Mommy and Daddy regularly like an ATM. I fucking hate debt with a passion (or even just temporarily owing someone else anything) and have basically never carried large amounts of it outside of the mortgage on my condo.

(My neuroticism around debt is probably why I paid off a historically low-rate mortgage…if I’d sunk that money into the stock market or something instead of paying it off, I probably would’ve made a fortune.)
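
(Back-of-the-envelope only, with made-up numbers rather than my actual balance or rates, ignoring taxes, risk, and amortization:)

```python
# Rough opportunity-cost math; every number here is a hypothetical assumption.
balance = 200_000        # hypothetical remaining mortgage balance
mortgage_rate = 0.03     # hypothetical "historically low" rate
market_return = 0.07     # hypothetical average annual market return
years = 15

interest_avoided = balance * ((1 + mortgage_rate) ** years - 1)
market_growth = balance * ((1 + market_return) ** years - 1)
print(f"Interest avoided by paying it off: ~${interest_avoided:,.0f}")
print(f"Hypothetical market gains on the same money: ~${market_growth:,.0f}")
print(f"Rough cost of my neuroticism: ~${market_growth - interest_avoided:,.0f}")
```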



imagine the downvotes coming from the same people that 20 years ago told me digital video would never match the artistry of film.

They’re right, IMO. Practical effects still look and age better than (IMO very obvious) digital effects. Oh, and digital de-aging looks like crap.

But this will always remain an opinion battle anyway, because quantifying “artistry” is in and of itself a fool’s errand.



AI: allowing corporate bureaucracy and inefficiency to continue to grow unscathed.


🎶 Tell me lies, tell me sweet little lies 🎶



It’s like they made a bot out of the Confidently Incorrect subreddit.


This argument is so stupid that it’s remarkably stupider than even the surrounding comments in a Lemmy thread full of braindead bot humpers.

Congrats! 🎈




Meanwhile, I’m at my job trying to get an instance of a machine that can automatically SFTP somewhere as part of a script, like it’s 1998 and I need a shell account from my dialup connection.
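
(Something like this sketch is all I’m after; the host, user, key, and paths are placeholders, and paramiko is a third-party library, so treat it as an assumption rather than the exact setup.)

```python
# Minimal scripted SFTP upload sketch (pip install paramiko); all values are placeholders.
import os
import paramiko

def upload(host, user, key_path, local, remote):
    client = paramiko.SSHClient()
    # Convenient for a sketch; in real use, load and verify known host keys instead.
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=host, username=user, key_filename=os.path.expanduser(key_path))
    try:
        sftp = client.open_sftp()
        try:
            sftp.put(local, remote)
        finally:
            sftp.close()
    finally:
        client.close()

if __name__ == "__main__":
    upload("sftp.example.com", "deploy", "~/.ssh/id_ed25519", "report.csv", "/incoming/report.csv")
```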



Having some free drugs at a party isn’t the same thing as “the first baggie being free”.

I would be very surprised if you could just walk up to drug dealers on the street and get free drugs like the urban-legend / astroturf / DARE / LEO myth suggests.





I get that it’s probably technically possible to bypass, but it wouldn’t matter. In some cases it’d actually be illegal to bypass, and almost nobody would do it.

But hey, it hasn’t happened (yet), so this is purely speculation.


I just had a horrifying thought while wondering what these TV manufacturers will do with the TVs if you just never connect them to the Internet.

I know people are worried about them jumping on your neighbor’s WiFi without prompting, or even hacking your own connection, etc., but I think that seems unlikely because of how deeply illegal it is…my horrifying thought was that they could just create a new, free mesh network that only serves up ads.

You’d need a fucking faraday cage wrapped around your TV in order to not see ads.

Now if you’ll excuse me I’m gonna go throw up. 🤮



Passkeys seem like mTLS…so much so that I’m not sure what the difference is.


Hold on honey, before we get our Wendy's I'll have to check the WSJ for the historical prices on chicken nuggies.

CR (Consumer Reports) - How to eat less plastic (February 2024 edition)