Eating the crispy chicken while thinking of the people who burned alive
I can’t imagine how you think it’s incredibly simple. These things are hell to explain to pretty much any normal person who needs to know why there’s no picture on the monitor or why their laptop/phone is not charging, or why the keyboard isn’t working in BIOS (no USB 3 support so you gotta switch to a USB 2 port). Add to that the combinatorial complexity of different cables and hubs supporting different things, and no tools for troubleshooting what feature is missing (and where in the chain) or what is suboptimal.
Worse, sometimes it’s my boss who thinks they can cheap out and get a USB-C dock instead of a proper dock, forcing me to run at lower, non-native resolutions or leaving me unable to use a second screen.
Search patterns yes, but also location data, and it’s aggregated over all your friends. So if you go to a restaurant together with a friend who recently searched for some clothes brand, the algorithm will know that and show you ads for that brand. Chances are you talked about his interests when you met, so you incorrectly infer that it was listening to the conversation.
Critically, “Meta, Amazon, and Microsoft told 404 Media they have no involvement with CMG’s Voice Data tool.”
But more importantly, they can’t listen in on your microphone unless you give them permission. It’s simply not technically possible. And like the article says, these days phones even show an indicator to alert you when the microphone is on.
Yeah I’m not sure that war crimes work that way. You don’t get a pass because the opponent is doing illegal things.
The UI looks mostly like Firefox IMO. Critically, it’s just as slow as Firefox.
This has to be a joke
Unfortunately the license agreement for .NET forbids publishing any kind of benchmark results.
WASM? Are you talking about WebAssembly?
More like, the devs already knew but some middle manager promised they would remove it without understanding the ramifications, and now they’ve been schooled.
AI as a general concept probably will at some point. But LLMs have all but reached the end of the line and they’re not nearly smart enough.
that’s when they start filtering out the genuine content and show you mostly promoted stuff
Isn’t it bad that the middle class’s finances depend on sky-high housing prices? That’s a continuation of the 1980s thinking that led to the financial crisis. How can we solve the housing crisis if we’re too afraid to even lower housing prices?
Yes, I do think it’s a sick and dysfunctional system when you can’t pay off a house before retirement. But the situation is what it is. I was only responding to your claim that no ordinary people benefit from it. My girlfriend is in exactly this situation: we want to move in together, but she can’t sell her home without being left with a debt. So for my own part I’m grateful for the rate cut right now, even though we’re “ordinary people”.
Raising inflation is exactly what the Riksbank is trying to achieve. Controlling inflation is essentially the only reason the Riksbank changes the interest rate.
One way it can help ordinary people: many are probably sitting on properties they bought when the rate was lower, and if they sold now they would have to take out an unsecured loan to cover the loss. They’re stuck in housing that may also be more expensive than they can really afford given the rate situation. A lower rate helps them out of that trap.
What a disgrace.
It’s crazy that so many people get so hung up on the size of the crowds. Like are you going to vote based on who gets the bigger crowd? What happened to voting based on the candidate’s policies and track record as a politician?
The catch is that they all need to run in the same transaction to be unaffected by other things going on in the database and to make updates atomic. A single transaction means a single connection, and ODBC/JDBC has no way of multiplexing or pipelining queries over a single connection.
It’s probably theoretically possible to run some things in different transactions. But with all the different layers and complexity of the code (including third party components and ORMs like Hibernate), understanding all the failure modes and possible concurrency issues becomes intractable.
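The single-connection constraint above can be sketched with a minimal example. This uses Python’s sqlite3 as a stand-in (JDBC/ODBC follow the same one-transaction-per-connection model), and the table and amounts are made up for illustration:

```python
import sqlite3

# One connection == one transaction context. Every statement that must be
# atomic with respect to the others has to run sequentially through this
# single connection -- there is no multiplexing or pipelining of queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 100), (2, 0)")
conn.commit()

try:
    # Both updates belong to the same transaction, so other sessions
    # observe either both changes or neither.
    conn.execute("UPDATE accounts SET balance = balance - 30 WHERE id = 1")
    conn.execute("UPDATE accounts SET balance = balance + 30 WHERE id = 2")
    conn.commit()  # atomic: both updates become visible together
except sqlite3.Error:
    conn.rollback()  # on failure, neither update is applied

balances = dict(conn.execute("SELECT id, balance FROM accounts"))
print(balances)  # {1: 70, 2: 30}
```

Splitting the two updates across two connections would put them in separate transactions, which is exactly where the concurrency-reasoning problems mentioned above begin.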
AI still needs a lot of parallelism but isn’t very latency-sensitive. That makes it ideal for a large expansion card rather than putting it directly on the CPU die.