Monkeys are a social construct. Like trees.
Yes, everything that can be expressed as letters is in the Library of Babel. Finding anything meaningful in that library, though, is gonna take longer than just writing it yourself.
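For a sense of scale, here's a back-of-the-envelope using Borges' own parameters (410 pages of 40 lines by 80 characters, drawn from 25 symbols); the script below is just illustrative arithmetic:

```python
import math

# Borges' parameters: 410 pages x 40 lines x 80 characters per book,
# drawn from a 25-symbol alphabet.
chars_per_book = 410 * 40 * 80            # 1,312,000 characters
digits = chars_per_book * math.log10(25)  # library size as a power of ten

print(f"{chars_per_book:,} characters per book")
print(f"~10^{digits:,.0f} distinct books in the library")  # ~10^1,834,097
```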
Also, the main problem with lidar is that it really doesn’t see any more than cameras do. It uses light (typically near-infrared), so it gets blocked by basically the same things that block a camera. When heavy fog easily fucks up both cameras and lidar at the same time, that’s not really redundancy.
The spinning lidar units also mechanically shed occlusions like raindrops and dust. And one important thing with lidar is that it actively emits laser pulses, so it’s a two-way operation, like driving with headlights, not just passive sensing, like driving by sunlight.
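To make the “active emission” point concrete: distance comes from timing the laser’s round trip. A minimal time-of-flight sketch, not any vendor’s actual pipeline:

```python
C = 299_792_458  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """The pulse travels out and back, so halve the round trip."""
    return C * round_trip_s / 2

# A return arriving ~667 nanoseconds after emission puts the object ~100 m away.
print(f"{tof_distance_m(667e-9):.1f} m")
```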
Waymo’s approach appears to differ in a few key ways, chief among them the extra sensor suite (lidar, radar, ultrasonics, additional cameras) layered on top of the camera vision stack.
There’s a school of thought that because many of these would need to be eliminated for true Level 5 autonomous driving, Waymo is in danger of heading down a dead end that never gets them to the destination. But another take is that this is akin to scaffolding during construction: it serves an important function while the permanent structure goes up, but can be taken down afterward.
I suspect the lidar/radar/ultrasonic/extra cameras will be most useful for training the models needed to reduce reliance on human intervention, and maybe eventually to reduce the number of sensors. Not just by adding to the quantity of training data, but as a filtering/screening function that improves the quality of the data fed into training.
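Purely illustrative of what such a screening function could look like; the rule and the names below are my invention, not anything Waymo has described:

```python
def keep_for_training(camera_depth_m: float, lidar_depth_m: float,
                      rel_err_threshold: float = 0.2) -> bool:
    """Hypothetical screen: keep a frame for vision training only when the
    camera model's depth estimate disagrees with the lidar measurement
    badly enough to be worth learning from."""
    rel_err = abs(camera_depth_m - lidar_depth_m) / lidar_depth_m
    return rel_err > rel_err_threshold

print(keep_for_training(42.0, 40.0))  # False: close agreement, low value
print(keep_for_training(60.0, 40.0))  # True: big miss, informative sample
```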
BYD was just a cell phone battery company, and was basically like, “Well, we’ve got the lithium supply chain locked down, and you know what needs huge batteries? Guess we’re doing cars now.”
Waymo chose the more expensive but easier option, which also limits their scope and scalability.
I don’t buy it. The lidar data is useful for training the vision models, so there’s plenty of reason to believe that Waymo can solve the vision issues faster than Tesla.
The thing is, if Intel doesn’t actually get 18A and beyond competitive, it might be on a death spiral towards bankruptcy as well. Yes, they’ve got a ton of cash on hand and several very profitable business lines, but that won’t last forever, and they need plans to turn profits in the future, too.
Compared to the AMD FX series, Intel’s Core and Core 2 were so superior that it was hard to see how AMD could come back from that.
Yup, an advantage in this industry doesn’t last forever, and a lead in a particular generation doesn’t necessarily translate to the next paradigm.
Canon wants to challenge ASML and get back into the lithography game with nanoimprint, a tooling shift they’ve been working on for a decade. The Japanese “startup” Rapidus wants to get into the foundry game by starting at 2nm, and they’ve got the backing of pretty much the entire Japanese electronics industry.
TSMC is holding onto FinFETs a little longer than Samsung and Intel as those two switch to gate-all-around FETs (GAAFETs). Which makes sense: those two never got to the point where they could compete with TSMC on FinFETs, so they’re eager to move on to the next thing a bit earlier while TSMC squeezes the last bit of profit out of its established advantage.
Nothing lasts forever, and the future is always uncertain. The history of the semiconductor industry is a constant reminder of that.
I just mean: does it keep offline copies of the most recently synced versions when you’re not connected to the internet? And does it propagate local changes whenever you’re back online?
Dropbox does that seamlessly on Linux and Mac (I don’t have Windows). It’s not just transferring files to and from a place in the cloud; it’s a seamless sync of a local folder whenever you’re online, with full access and use while you’re offline.
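The behavior being asked about is roughly this; a toy sketch of offline-first sync with invented names, not Dropbox’s actual protocol:

```python
import queue

pending = queue.Queue()  # local changes made while offline (or online)

def on_local_change(path: str) -> None:
    """Record every local edit; the file stays usable on disk regardless."""
    pending.put(path)

def on_reconnect(upload) -> None:
    """When connectivity returns, replay the queued changes to the server."""
    while not pending.empty():
        upload(pending.get())
```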
Intel got caught off guard by the rise of advanced packaging, where AMD’s chiplet designs could actually compete with a single die (while having the advantage of being more resilient against defects, and thus yielding better).
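The yield point is easy to see with the standard Poisson defect model, yield ≈ exp(−area × defect density); the numbers below are illustrative, not actual process data:

```python
import math

def poisson_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Probability a die has zero defects: exp(-A * D)."""
    return math.exp(-area_cm2 * defects_per_cm2)

D = 0.1  # defects per cm^2, purely illustrative

print(f"600 mm^2 monolithic die: {poisson_yield(6.00, D):.0%}")  # ~55%
print(f" 75 mm^2 chiplet:        {poisson_yield(0.75, D):.0%}")  # ~93%
# Eight good 75 mm^2 chiplets cover the same silicon, but a defect
# only scraps one small chiplet instead of the whole big die.
```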
Intel fell behind on manufacturing when FinFETs became the standard. TSMC leapfrogged Intel (and Samsung fell behind) based on TSMC’s undisputed advantage at manufacturing FinFETs.
Those are the two main areas where Intel gave up its lead, both on the design side and the manufacturing side. At least that’s my read of the situation.
Does it do offline sync?
iCloud doesn’t have Linux or Android clients (Windows does get one). It’s basically a non-starter for file sharing between users not on an Apple platform.
I don’t like the way Google Drive integrates into the OS file browsing on macOS, and it doesn’t officially support Linux. Plus it does weird stuff with the Google Photos files, which count against your space but aren’t visible in the file system.
OneDrive doesn’t support Linux either.
I just wish Dropbox had a competitive pricing tier somewhere below their 2TB for $12/month. I’d 100% be using them at $5/month for like 250 GB.
So with the case/mobo/power supply at $259 and the CPU/GPU at $329, you’ve got $11 left for RAM and an SSD if you want to be competitive with the base model Mac Mini.
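The arithmetic, assuming the $599 base Mac Mini price mentioned downthread:

```python
mac_mini_base = 599  # base configuration price
case_mobo_psu = 259
cpu_gpu = 329

remaining = mac_mini_base - (case_mobo_psu + cpu_gpu)
print(f"${remaining} left for RAM and SSD")  # $11
```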
That’s what I mean. If you’re gonna come close to the entry-level price of the Mac Mini (to say nothing of the frequent sales/offers/coupons that Best Buy, Amazon, B&H, and Costco run), you’ll have to sacrifice and use a significantly lower-tier CPU. Maybe you’d rather have more RAM/storage and are OK with the lower-performing CPU and roughly twice the power consumption (around 65W rather than 30W), but at that point you’re basically comparing a different machine.
OK, let’s put together a mini PC with a Ryzen 9700X for under $600. What case, power supply, motherboard, RAM, and SSD are we gonna get? And how does it compare on power, noise, and form factor?
It’s an apples-to-oranges comparison, and at a certain point you’re comparing different things.
When I was last comparing laptops a few years back, I was seriously leaning toward the Framework AMD. It was clearly a tradeoff: Apple’s displays, trackpad, lid hinges, CPU/GPU benchmarks, and battery life versus much more built-in memory and storage, a taller display aspect ratio, and better Linux support. Price was kinda a wash, since I was just comparing what I could get for $1500 at the time. I ended up with an Apple again, in the end. I’m keeping an eye on progress with the Asahi project, though, and might switch OSes soon.
For the Mac Mini? The Apple Silicon line has always been a really good value for the CPU compared to similar performance from Intel and AMD. The upcharge on RAM and storage meant it roughly broke even after one or two upgrades, if you were looking for a comparable CPU/GPU.
For my purposes, the M1 Mac Mini was cheaper than anything else I was looking at for a low-power/quiet home server back in 2021, thanks to some random Costco coupon for $80 off the base $599 configuration. A little more CPU than I needed and a little less RAM than I would’ve preferred, but it was fine.
Plus, having official Mac hardware lets me run a BlueBubbles server and hack Backblaze pricing (unlimited data backup for any external storage you can hook up to a Mac), so that was a nice little bonus compared to running a Linux server.
On their laptops, they’re kinda cost-competitive if you’re looking for high-DPI screens, and there’s just no good comparison for that CPU/GPU performance per watt. If you don’t need or want those things, Macs aren’t a good value; but if you are looking for them, the other manufacturers aren’t going to offer better value.
Uh, that was Siri’s fault.
He’s a great guy, but sometimes a little hard to follow if you’re only taking part in one conversation at a time while he’s talking in two and listening to a third. He expects you to be on the ball in your own discussion when he jumps in to drop a tidbit or ask a question, like a chess master playing four games in the park at once.
If it’s like simultaneous chess, why isn’t a single thread sufficient context for everything that happens in that thread? It just sounds like the guy you’re describing has low cognitive empathy and doesn’t model other people’s minds. At that point you’re just describing a neurodivergent person who may or may not be a genius in certain domains while being a moron in this one domain you’ve described.
Yeah, Netscape 4.0 was simply slower than IE 4.0. Back then, when a browser was a program that actually pushed the limits of the hardware, that was a big deal.
Now that splash screen, with its pixelated gradient from the 256-color palette, brings back some nostalgic memories.
It’s funny because the pixelated stuff we see today is mostly shitty JPEG artifacts, but those follow the JPEG algorithm’s idea of how best to conserve file size within its compression scheme, so they look different. This splash screen seemingly has every pixel meticulously chosen so that it’s in the right place, working within only the limits of the color space.
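The contrast is easy to reproduce; a quick sketch with Pillow (the filenames are just placeholders):

```python
from PIL import Image

img = Image.open("splash.png").convert("RGB")  # any source image

# Indexed color, like that era: every pixel snaps to a 256-color palette.
img.quantize(colors=256).save("palette_256.png")

# Heavy JPEG compression: the artifacts follow the compressor's 8x8
# block transform, not an artist's pixel placement.
img.save("blocky.jpg", quality=5)
```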
Yes, but who says that specific clade maps onto the colloquial word “monkey”?