I’m not talking about drivers for hardware outside the cpu. But the processor itself usually contains a bunch of peripherals that need their own drivers.
it’s very possible to get linux to run on a processor without having implemented all the functionality. You can just not support some onboard peripherals yet and do some things inefficiently in software. You don’t need good power management to simply be “running”, etc.
Getting linux to run is the first step, not the last. It’s the barest minimum you could do to have a product to sell. Running well and taking proper advantage of all the hardware features is a whole different game.
do they come with all the necessary drivers? Or are they hoping those will magically appear in the linux kernel after they’ve sold a bunch?
nah, this is just copium. Apple doesn’t release dev kits to the general public. It was a real product, and it was a dud.
i would like to filter out all “massive multiplayer online arena shooters”. There are way too many of them.
it would probably be some markov chain generator or something.
Also known as an llm
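For what it’s worth, a bigram Markov chain text generator really is only a few lines. This is a made-up toy sketch (the corpus and function names are invented for illustration):

```python
import random
from collections import defaultdict

def build_chain(text):
    # Map each word to the list of words that follow it in the corpus.
    chain = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=10):
    # Walk the chain, picking a random successor at each step.
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat slept"
chain = build_chain(corpus)
print(generate(chain, "the"))
```

An llm does the same “predict the next token from context” trick, just with a learned model instead of a lookup table of observed successors.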
but if not, and they have two or more children, the amount gets split between them. Then you handle those, and their children, etc. It’s financial homeopathy.
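The dilution rule above can be sketched as a recursive split down a family tree. This is a toy illustration with a made-up tree and amounts, not any real inheritance law:

```python
def split_down(amount, children_of, person):
    # If the person has children, split the amount equally between them
    # and recurse; otherwise the person keeps whatever trickled down.
    kids = children_of.get(person, [])
    if not kids:
        return {person: amount}
    shares = {}
    per_child = amount / len(kids)
    for kid in kids:
        shares.update(split_down(per_child, children_of, kid))
    return shares

# Hypothetical tree: "a" has two children, "b" has none.
tree = {"grandparent": ["a", "b"], "a": ["a1", "a2"]}
print(split_down(100.0, tree, "grandparent"))
# a1 and a2 each end up with 25.0, b keeps 50.0
```

Each extra generation halves the share again, which is why it gets homeopathic quickly.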
Any processor can run llms. The only issue is how fast, and how much ram it has access to. And you can trade the latter for disk space if you’re willing to sacrifice even more speed.
If it can add, it can run any model
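To make the “if it can add” point concrete: here’s a toy matrix-vector product, the core operation of a neural net layer, built from nothing but repeated addition (integer weights only, and the function names are invented for illustration):

```python
def add_mul(a, b):
    # Multiplication as repeated addition -- every multiply-accumulate
    # in a model reduces to this, just very slowly.
    result = 0
    for _ in range(abs(b)):
        result += a
    return -result if b < 0 else result

def matvec(matrix, vector):
    # One layer of a neural net is just rows of multiply-accumulates.
    out = []
    for row in matrix:
        acc = 0
        for w, x in zip(row, vector):
            acc += add_mul(w, x)
        out.append(acc)
    return out

print(matvec([[1, 2], [3, -4]], [5, 6]))  # [17, -9]
```

So capability isn’t the question; throughput and memory are.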
but we aren’t any closer to agi than we were in the 50s. $100 billion in revenue for openai won’t bring it any closer either.
raytracing is insanely expensive. If you saw what current cards can render in real time without ai denoising and a lot of temporal tricks (which only look good in screenshots), you would see a very noisy, incomplete image that looks like shit. It is very, very far from being able to render an actual frame with decent performance.
yes, but what you need to be doing is tons of multiply-accumulate, using a fuckton of memory bandwidth… Which a gpu is designed for. You won’t design anything much better with an fpga.
we use wired communications until the issue is solved.
it’s actually unknown. It looks like it, but it is not proven
i’d say it’s closer to 105
but not all of it is converted back to heat. Some of the energy goes into doing actual work; only the rest comes back out as heat.