Moving on fifteen years, StumbleUpon scratched that exact itch!
I mean you’re not far wrong. I always dreamed of a Ford Sierra or a Vauxhall VX220, but the Scooby with McRae and Grist’s names on it was an absolute belter.
I suppose the last one is halfway true. In the UK before internet access was mainstream, you either had to use the school/work network connection and their weird access control packages, or use the local library. In any case, you actually had to get dressed to use the internet.
This was when ISDN was a fat pipe, and if you went to the library, you had to plan what you were going to look up because you paid for 30 minutes of access time. After you’d searched for PS1 cheat codes, Ask(ed) Jeeves for a fact to settle an argument, and looked up pictures of the 555-branded Subaru Impreza, it was time to burn off whatever access time was left on Lycos, Excite, or Google’s directory service to find new cool stuff.
Old school.
wow
very sale
much eat
so fruit
wow
…
much sad that Kabosu not here
The way it’s going, population centres will be nuclear waste soon.
I haven’t said anything so bleak in a while. Time for Rick Astley and kittens I think.
Awesome, I think I may go back to a language myself. Thanks for checking it out and letting us know!
Anecdotally, a friend who’s pretty handy at languages uses more Memrise than Duolingo now. Similar sort of setup, but with a different style of delivery - more visual cues and a better repetition approach.
Semi-offtopic: my life has gotten a little bit better since I used the Android (other OSes likely have similar functionality) option to block any withheld or blocked caller IDs.
Partly because it’s rarely anything good these days if it’s a withheld number, but mainly because my workplace’s outbound calls get automatically blocked, so work calls don’t get displayed on my phone unless someone uses Skype or a work mobile. Winner.
9/10 would recommend A+++++ unless you’re on call, of course.
Awesome, thanks for the insight.
I’m showing my age here, but much like we had maths coprocessors running beside the 286 and 386 gen CPUs to take on floating point operations, then graphics cards offloaded geometry-based maths operations to GPUs - are we looking at AI-specific dies or chips to work specifically on AI functions?
Excuse my oversimplification, this isn’t my field of expertise!
Wouldn’t this absolutely hammer the battery though, or at least give the CPU a hard time? My understanding is that offloading the work to a cloud platform means that the processor-intensive inputting, parsing, generating, and outputting operations are done in purpose-built datacentres, and end user devices just receive the prepared answer.
Wouldn’t this rinse the battery and increase the overall device temperature for “normal” end users?
Fair warning: I haven’t read the two papers outlined in the article.
Man I’ve seen derivatives of this stuff, but this must be the OG post - thanks for that!
I thought the whole Lunix thing came from the elite hacker JEFF.K!!!11 so I’m chuffed there’s another level to this!
“to compile the kernel you must kill me, Linus Romero”
There’s a safer but equally insane solution off some A-roads in the UK: https://maps.app.goo.gl/vsNMgsfYT7ooa39R9 .
I’m all for expanding cycle networks across major roads, but the wider motorist mindset isn’t quite ready for it yet, particularly on my old commuting routes like these arterial roads into London. I’d love to see more cycle lanes, but not at the cost of more injured or dead cyclists or scooter users.
Yeah, the charge got binned as internet access became more mainstream. It was inexpensive though, like £2 for half an hour or something.
I’d pay a fair bit more to go back to an age when staring at this beautiful icon was all the reassurance you needed that the page was on its way: