2) Surely, someone needs access to the account. How do you prevent those with access from using it? Security feels like turtles all the way down where you ultimately have to trust a few people to do the right thing.
The only reason someone would need access to the management account would be maintaining child accounts and IAM roles or reviewing logs, none of which should need root.
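Concretely, that workflow never needs root; something like this boto3 sketch covers it (the account ID and role name below are placeholders, with OrganizationAccountAccessRole just being the Organizations default):

```python
# Hedged sketch: from the management account, assume the default cross-account
# role into a child account instead of ever touching root credentials.
import boto3

sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/OrganizationAccountAccessRole",  # placeholder account ID
    RoleSessionName="child-account-maintenance",
)["Credentials"]

# Work in the child account with the temporary credentials.
child_iam = boto3.client(
    "iam",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
print([r["RoleName"] for r in child_iam.list_roles()["Roles"]])
```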
So long as the storage system can serve a video stream without stuttering, that covers the 99% performance case for me. Anything beyond that is bulk transfers, which aren't time-sensitive.
Without digging into the study, hasn't this been true for many years now? I'd love to know the actual pivot date when commercial-scale deployments became the cheapest option.
I dug around trying to find the actual paper, and it appears it's a pre-print with only the abstract available (but maybe I wasn't looking in the right spot).
That being said, the quotes from the author were more to the point that it's a milestone that, in the UK, solar+battery systems are now less expensive than gas/coal.
To my understanding, this is a milestone for full systems vs "raw" production numbers, which, as you correctly say, have had solar as the cheapest option for years.
The claim hinges entirely on the narrow definition of "large-scale energy generation." For the UK, with its high seasonal energy demand and low winter solar output, the cost of generation is almost irrelevant next to the cost of firming that power for 24/7/365 availability. While the paper[1] shows solar PV and daily-cycle batteries are getting cheaper, it also shows seasonal storage solutions like hydrogen are still an order of magnitude too expensive and inefficient (huge capex for electrolyzers/storage + poor round-trip efficiency). So providing reliable, 24/7/365 baseload power from PV + storage in the UK is demonstrably not cheaper than gas or nuclear today.
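To make the round-trip point concrete, a purely illustrative calculation with made-up placeholder numbers (not taken from the paper):

```python
# Illustrative only: how poor round-trip efficiency alone inflates the cost of
# firmed power from seasonal hydrogen storage, before any electrolyzer/storage capex.
solar_lcoe = 40.0        # $/MWh of raw PV generation (placeholder)
round_trip_eff = 0.40    # electrolysis + storage + re-generation (placeholder)

# Every MWh delivered back out of storage needs 1/eff MWh of PV going in.
cost_of_stored_mwh = solar_lcoe / round_trip_eff
print(cost_of_stored_mwh)  # 100.0 $/MWh, i.e. 2.5x the raw PV cost, capex not included
```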
The per-kWh capacity cost at the link for hydrogen is very high compared to others I've seen. I wonder about the assumptions going into it. Are they assuming above-surface compressed hydrogen tanks, or liquid hydrogen?
Ultra-cheap thermal storage promises costs at least an order of magnitude below that.
Holy crap, the UK has absolutely awful solar insolation; that solar and batteries are now cheaper than coal there really emphasizes how much the tech has advanced, and continues to advance!
Germany also has absolutely terrible solar resources, worse than any continental US state, and is also deploying tons and tons of solar.
Solar really is one of the more amazing technologies of our time, especially when combined with batteries, which advance almost as fast.
We will get far better reliability, costs, and air quality as coal is completely replaced by modern clean energy systems.
It's kind of funny because you just know that in 30 years or so Texas is going to be a gloriously wealthy producer of solar power, and it'll probably be owned by the same people who own the oil fields today, but the current stranglehold of oil is preventing those very same people from getting richer.
The part about it being owned by the people who have the oil is pretty true. I know a number of Texans who still make good money on their wells, which has allowed them to accumulate a lot of land. They've been receptive to adding wind and solar farms on their properties, with wind being a pretty big winner around here.
You can definitely get richer selling cheaper energy. If the market demand price of energy is still high but you reduce your costs, you're making more money.
Germany has deployed a lot of solar because of the Stromeinspeisungsgesetz from the 90s. It was the first feed-in tariff law in the world, AFAIK.
It guaranteed you a certain rate for a set number of years. As a result, people were driving out to farmers to rent their roofs to install solar on them.
There’s a weird thing the contestants in the solar car challenge that Australia has been hosting since forever have noticed: they generate more power on hazy or partly cloudy days than on sunny ones. I don’t know if they ever sorted out why. Speculation was reflected light off the clouds, but I suspect panel temperature also played a role. And road temperature affects panel temps.
What the UK cannot do is concentrating solar. The efficiency absolutely crashes in diffuse light.
Yeah. The temperature issue would have been my first guess.
Regarding concentrating solar: are people still trying to make that work for commercial generation? I thought this had generally failed to pan out for electricity generation.
There are many variations of concentrating solar. There’s the tower, there’s the curved mirror with the tube of oil, and there’s the magnifying glass with a small, exotic solar cell that can handle 10 Suns’ worth of light.
None of them work on overcast days because they rely on parallel rays of light.
Only 10% of people who use computers have a job with any professional requirements. All of those expert tools are faking their usage statistics and market share research.
You're going to need to cite your statistics about the specific Windows-only professional requirements and how many people need them instead of continuing the snark chain.
> 15% of adults in the U.S. only use mobile devices to access the internet.
You're down to 85% already who even have the possibility of using your unicorn Windows-only software.
Yes, CNC machinists, mechanical engineers, and graphic designers exist. No, they're not actually the majority of the population. Also keep in mind this thread was talking about personal computers, not just work computers; just because some cashier is required to use a proprietary Windows XP program on their cash register doesn't mean they need to use Windows at home. Your argument is restricted to the small proportion of people who are either required or want to do day-job stuff on their personal PCs (and not all of them actually need the highly specialized software you're referring to).
I guess if things get ugly they have some (terrible) plausible deniability? “What do you mean my $10 Windows Enterprise key is stolen? I thought that was the going rate. Linux is free”
Certainly seems like the easiest solution. A lot of handwringing about the poor database performance of UUIDv4 when you could use it exclusively as an external identifier, all for the cost of an additional column.
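Something like this sketch is all it takes (SQLite for brevity; the table and column names are made up):

```python
# Keep a small internal integer PK for joins and indexes; expose only a UUIDv4.
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        id          INTEGER PRIMARY KEY,   -- internal, never leaves the backend
        external_id TEXT UNIQUE NOT NULL,  -- UUIDv4, safe to put in URLs and APIs
        email       TEXT NOT NULL
    )
""")

ext = str(uuid.uuid4())
conn.execute("INSERT INTO users (external_id, email) VALUES (?, ?)", (ext, "a@example.com"))

# External callers look up by external_id; internal joins keep using id.
print(conn.execute("SELECT id FROM users WHERE external_id = ?", (ext,)).fetchone())
```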
Also, poor performance of UUIDv4 primary keys is mostly related to how write-heavy your table is in the first place, and in particular how insertion-heavy it is. In theory your users table shouldn't be very write-heavy, even if it may be insertion-heavy compared to its other writes.
My framing device: if you had LLMs in the 1500s, how would that help Copernicus determine the orbits of the planets? Maybe through dumb chance, but creating a well reasoned model of the universe required new observations and the ability to interpret the data from a different point of view.
All the 1500s and earlier data that such an LLM would have to have been trained on would lead to an LLM that wouldn’t ever suggest a heliocentric solar system. That LLM might even say he was heretical or refuse to give an answer to anything that led to it saying that the earth wasn’t the centre of the universe. So no help at all.
Interesting framing. Although I assume all the observations had been done already. It was more about being bold enough to investigate a line of thought that wasn't obvious or popular at the time and proving it convincingly.
They already had many "explanations" and models for why the planets were seemingly moving back and forth in the sky during the year. Their models were more complicated than necessary simply because they didn't want to consider the different premise.
I believe the charitable interpretation is that it is not possible without breaking an enormous amount of legacy code. Which does feel close enough to “not possible”.
Some situations could be improved by allowing multiple library versions, but this would introduce new headaches elsewhere. I certainly do not want my program to have N copies of numpy, PyTorch, etc. because some intermediate library claims to have a just-so dependency tree.
What do you do today to resolve a dependency conflict when an intermediate library has a just-so dependency tree?
The charitable interpretation of this proposed feature is that it would handle this case exactly as well as the current situation, if the situation isn't improved by the feature.
This feature says nothing about the automatic installation of libraries.
This feature is absolutely not about supporting multiple simultaneous versions of a library at runtime.
In the situation you describe, there would have to be a dependency resolution, just like there is when installing the deps for a program today. It would be good enough for me if "first import wins".
> What do you do today to resolve a dependency conflict when an intermediate library has a just-so dependency tree?
When an installer resolves dependency conflicts, the project code isn't running. The installer is free to discover new constraints on the fly, and to backtrack. In effect it's all done "statically", in the sense that it happens before any other part of the system cares about the result being complete and correct.
Python `import` statements on the other hand execute during the program's runtime, at arbitrary separation, with other code intervening.
> This feature says nothing about the automatic installation of libraries.
It doesn't have to. The runtime problems still occur.
I guess I'll have to reproduce the basic problem description from memory again. If you have modules A and B in your project that require conflicting versions of C, you need a way to load both at runtime. But the standard import mechanism already hard-codes the assumptions that i) imports are cached in a key-value store; ii) the values are singleton and client code absolutely may rely on this for correctness; iii) "C" is enough information for lookup. And the ecosystem is further built around the assumption that iv) this system is documented and stable and can be interacted with in many clever ways for metaprogramming. Changing any of this would be incredibly disruptive.
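A minimal illustration of assumptions i) through iii), using only the stdlib:

```python
# Imports are cached in sys.modules, keyed only by the module name, and every
# importer gets the same singleton object back from that cache.
import sys
import json

import json as json_again
assert json is json_again               # same object, served from the cache
assert sys.modules["json"] is json      # the cache is a plain dict keyed by "json"

# Client code routinely relies on the singleton property, e.g. for monkeypatching:
sys.modules["json"].my_flag = True      # now visible to every other importer of json
```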
> This feature is absolutely not about supporting multiple simultaneous versions of a library at runtime.
> and support having multiple simultaneous versions of any Python library installed.
Which would really be the only reason for the feature. For the cases where a single version of the third-party code satisfies the entire codebase, the existing packaging mechanisms all work fine. (Plus they properly distinguish between import names and distribution names.)
> and support having multiple simultaneous versions of any Python library installed.
Installed. Not loaded.
The reason is to do away with virtual environments.
I just want to say `import numpy@2.3.x as np` in my code. If 2.3.2 is installed, it gets loaded as the singleton runtime library. If it's not installed, load the closest numpy available and print a warning to stderr. If a transitive dependency in the runtime tree wants an incompatible numpy, tough luck; the best you get is a warning message on stderr.
You already have the A, B, C dependency resolution problem you describe today. And if it's not caught at the time of installing your dependencies, you see the failure at runtime.
You'd have to invent a different way, within existing Python syntax, to communicate the version, but you can do this today with sys.path and sys.meta_path hacks.
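For example, a rough sketch of the kind of meta_path hack I mean (the pinned-directory layout is hypothetical, not any real convention):

```python
# A finder that resolves selected top-level packages from a version-specific
# directory before the normal finders get a chance.
import sys
import importlib.machinery

PINNED = {
    # package name -> directory holding exactly the version you want (placeholder path)
    "numpy": "/opt/pinned/numpy-2.3.2",
}

class PinnedFinder(importlib.machinery.PathFinder):
    @classmethod
    def find_spec(cls, fullname, path=None, target=None):
        if "." not in fullname and fullname in PINNED:
            return super().find_spec(fullname, [PINNED[fullname]], target)
        return None  # fall through to the normal finders

sys.meta_path.insert(0, PinnedFinder)

import numpy as np  # resolved from the pinned directory, if it exists there
```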
But virtual environments are quite simply not a big deal. Installed libraries can be hard-linked and maybe even symlinked between environments and this can be set up very quickly. A virtual environment is defined by the pyvenv.cfg marker file; you don't need to use or even have activation scripts, and you especially don't (generally) need a separate copy of pip for each one, even if you do use pip.
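For example, roughly (POSIX layout, placeholder path; on Windows the interpreter lives under Scripts\ instead of bin/):

```python
# Create a venv and use it without any activation script, by invoking its
# interpreter directly. Activation only tweaks PATH and the shell prompt; the
# interpreter finds its environment from the pyvenv.cfg sitting next to it.
import venv
import subprocess

venv.create("/tmp/demo-env", with_pip=True, symlinks=True)
subprocess.run(["/tmp/demo-env/bin/python", "-c", "import sys; print(sys.prefix)"])
```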
On the flip side, allowing multiple versions of a library in a virtual environment has very little effect on package resolution; it just allows success in cases of conflict, but normally there aren't conflicts (because you're typically making a separate environment for a single "root" package, and it's supposed to be possible to use that package in Python as it actually exists, without hacks). The installer still has to scrounge up metadata (and discover it recursively) and check constraints.