  • Funnily enough, one of the few legitimately impactful non-enterprise uses of AVX512 I’m aware of is that it does a really good job of accelerating emulation of the Cell SPUs in RPCS3 (see the sketch at the end of this comment). But you’re absolutely right, those things are very funky, and emulating them is by far the most difficult part of PS3 emulation.

    Luckily, I think most games either didn’t do much with them or left programming for them to middleware, so it would mostly be first- and second-party games that would need super-extensive customisation and testing. Sony could probably figure it out, if they were convinced there was sufficient demand and potential profit on the other side.
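
    A minimal, illustrative sketch of why AVX-512 helps (this is not RPCS3’s actual code, just the general idea): the SPU’s “selb” instruction picks, bit by bit, between two 128-bit registers based on a third register used as a mask. With plain SSE that costs three logic operations per emulated instruction; with AVX-512VL it collapses into a single VPTERNLOGD. Savings like that, multiplied across the millions of recompiled SPU instructions executed every frame, are where the speed-up comes from.

      /* Hypothetical sketch, not RPCS3 source - shows how one SPU instruction
       * maps onto AVX-512.  Build with e.g.: gcc -O2 -mavx512f -mavx512vl selb.c
       * (needs an AVX-512 capable CPU to run). */
      #include <immintrin.h>
      #include <stdio.h>

      /* Baseline SSE2 emulation of SPU selb: rt = (ra & ~rc) | (rb & rc).
       * Three logic instructions per emulated op. */
      static __m128i selb_sse2(__m128i ra, __m128i rb, __m128i rc)
      {
          return _mm_or_si128(_mm_andnot_si128(rc, ra), _mm_and_si128(rc, rb));
      }

      /* AVX-512VL version: a single VPTERNLOGD.  Immediate 0xCA is the
       * truth table for "bitwise select": result = A ? B : C, with A = rc. */
      static __m128i selb_avx512(__m128i ra, __m128i rb, __m128i rc)
      {
          return _mm_ternarylogic_epi32(rc, rb, ra, 0xCA);
      }

      int main(void)
      {
          __m128i ra = _mm_set1_epi32(0x11111111);
          __m128i rb = _mm_set1_epi32(0x77777777);
          __m128i rc = _mm_set1_epi32(0x0F0F0F0F); /* take rb where mask bits are 1 */

          unsigned a[4], b[4];
          _mm_storeu_si128((__m128i *)a, selb_sse2(ra, rb, rc));
          _mm_storeu_si128((__m128i *)b, selb_avx512(ra, rb, rc));

          printf("sse2:   %08x\navx512: %08x\n", a[0], b[0]); /* both 17171717 */
          return 0;
      }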


  • The Xbox 360 was built around the same kind of weird, in-order PowerPC core as the PS3’s Cell (the PPE); it just had three of them stuck together instead of one of them tied to seven weird SPE co-processors. The TL;DR of how Xbox backwards compatibility has been achieved is that Microsoft’s whole approach with the Xbox has always been to create a PC-like environment, which makes porting games to or from the Xbox simpler.

    The real star of the show here is the Windows NT kernel and DirectX. Microsoft’s core APIs have been designed to be portable and platform-agnostic since the beginning of the NT days (of course, that isn’t necessarily true of the rest of the Windows operating system we use on our PCs). Developers could still program their games mostly as though they were targeting a Windows PC using DirectX since all the same high-level APIs worked in basically the same way, just with less memory and some platform-specific optimisations to keep in mind (stuff like the 10MB of eDRAM, or that you could always assume three 3.2GHz in-order CPU cores with 2-way SMT).

    Xbox 360 games on the Xbox One seem to be run through something akin to Dolphin’s “Übershaders” - in this case, per-game-optimised software implementations of the entire Xenos GPU stack, running alongside the entire Xbox 360 operating environment under a hypervisor. This is aided by the integration of hardware-level support for certain texture and audio formats common in Xbox 360 games into the Xbox One’s CPU design, similarly to how Apple’s M-series SoCs integrate support for x86-style memory ordering to greatly accelerate Rosetta 2.

    Microsoft’s APIs for developers to target tend to be fairly platform-agnostic - see Windows CE, which could run on anything from ARM handhelds to the Hitachi SH-4 powered Sega Dreamcast. This enables developers who are mostly experienced in coding for x86 PCs running Windows to relatively easily start writing programs (or games) for other platforms using those APIs. This also has the beneficial side-effect of allowing Microsoft to, with their collective first-hand knowledge of those APIs, create compatibility layers on an x86 system that can run code targeted at a different platform.




  • Yeah, Windows’ bullshit is what drove me to Linux in the first place. I only have it on my gaming system, and only because Discord’s stupid screensharing doesn’t transmit audio on Linux, NVIDIA’s drivers for Linux suck balls (going AMD next time now that their cards are good again) and there are a couple of games my friends play that have issues on Linux. I’ve never run into a game on my everyday laptop that Linux couldn’t run, and the Steam Deck will take basically whatever you throw at it.

    Windows is a barely-functional rat’s nest of code spaghetti that falls apart at complete random. Sometimes your audio drivers will just stop working for no apparent reason. Sometimes your computer will just refuse to connect to the internet until you do a clean install. Windows Update apparently runs Prime95 in its spare time and so does the Antimalware Service Executable. I hate using it so much. I wish Windows would just curl up and die.



  • Did you read the article? Excerpts include:

    Generally, in business, it is sensible to provide your customers with what they want. With Twitter, the meme-makers’ favourite billionaire is doing the opposite. The cyber-trucker is trying his best to cull his customer base.

    Threads is what would happen if Twitter and Instagram made out in a bowling alley. It’s all their worst parts combined - but it may well succeed. Rocket-man Musk’s changes to Twitter have not exactly made it ‘brand friendly’. Threads, meanwhile, is shaping up to be a paradise for in-your-face brands - and the AdTech industry would love for you to join them.

    and

    Threads’ naffness won’t stop its success. It’s data-scraping fluffily dressed up as substandard corporate twaddle. It’s a cringe-inducing privacy invasion. It’s not meant for users, but that doesn’t really matter: you’re not a user, you’re a product.

    It’s describing Threads as a product meant not for users but for advertisers. The perfect brand-friendly non-place for companies to stick their marketing crap. That doesn’t really come across as a ringing endorsement to me.




  • Killing two people who are both destined to die in the very near future to save one who will live for a considerably longer period and save a greater number of lives would be the right thing to do from a utilitarian standpoint. Tuvix, meanwhile, is a healthy being who was competent to discharge his duties and posed no threat.

    Consequently, while there is an argument to be made that killing one person to revive two others is a net benefit, the burden of suffering on that one person is extreme, and whether or not it is outweighed by the positive nature of the two others returning to life is very much a matter of individual outlook.

    It is also worth noting that Tuvok and Neelix as they existed before could be considered “already dead” as a result of their combination into a single entity. Thus you could argue that what actually happened is that Tuvix “died” so that clones of the deceased Tuvok and Neelix could be created from him. Admittedly this is a shaky argument given the whole “do transporters actually kill people in-universe” thing.


  • I’d be fine paying Google for YouTube Premium if I could use it without being logged in. I’d take an access key for anonymous ad-free viewing for $20 a month. But Google is never going to offer that because the data-harvesting is the whole point of YouTube to them. Google is a data-slurping advertising company that dabbles in video, search and phones as side hustles.

    In any case, if they really do crack down on adblockers, there are always other methods of watching their videos ad-free, and if I really like a creator, I’ll subscribe to their Patreon or watch them on Nebula.


  • Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple’s “UltraFusion” and AMD’s “Infinity Fabric” to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.

    As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback… at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.
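
    To put “make proper use of the multiple processing units” in concrete terms, here is a rough sketch (plain Vulkan 1.1, not tied to any vendor; error handling stripped) of how an application could check whether the driver exposes several GPUs as one explicit device group it can split work across. The API plumbing has existed for years; the open question is whether engines would actually use it.

      /* Illustrative sketch: enumerate Vulkan physical-device groups.
       * A group with more than one GPU is the modern, explicit analogue of
       * SLI/CrossFire - the application, not the driver, decides how work
       * gets split across the dies. */
      #include <vulkan/vulkan.h>
      #include <stdio.h>
      #include <stdlib.h>

      int main(void)
      {
          VkApplicationInfo app = {
              .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
              .apiVersion = VK_API_VERSION_1_1, /* device groups are core in 1.1 */
          };
          VkInstanceCreateInfo ici = {
              .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
              .pApplicationInfo = &app,
          };
          VkInstance instance;
          if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS)
              return 1;

          uint32_t groupCount = 0;
          vkEnumeratePhysicalDeviceGroups(instance, &groupCount, NULL);

          VkPhysicalDeviceGroupProperties *groups = calloc(groupCount, sizeof(*groups));
          for (uint32_t i = 0; i < groupCount; i++)
              groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
          vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups);

          for (uint32_t i = 0; i < groupCount; i++)
              printf("device group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);

          free(groups);
          vkDestroyInstance(instance, NULL);
          return 0;
      }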


  • To be fair, a lot of these are accurate, or at least were at the time.

    • Multi-GPU just never caught on. There’s a reason you don’t see even the most hardcore gaming machines running SLI today.

    • The Wii’s novelty wore off fairly quickly (about the time Kinect happened), and it didn’t have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.

    • Spore is largely forgotten, despite the enormous hype it had before release. It’s kind of the Avatar of video games.

    • It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn’t notice or care: basically no apps average people used were 64-bit native back then, and you were lucky to have more than 4GB in your entire machine, let alone need more than that for one program.

    • Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.

    • Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-Ray disc rather than download from iTunes (can you even still do that with modern shows?).

    • I definitely know people who didn’t get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they’re still outselling Blu-Rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.

    • The Eee PC and the other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.

    • The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier… the iPhone family did make a huge impact in the long run, but it wasn’t until the 3GS that it was a true competitor to something like a Symbian device.

    The only entry on this list that’s really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck’s guts and has proudly never had a Facebook account.