
The gigantic and unregulated power plants in the cloud

delroth
59 replies
1d

In the Netherlands alone, these solar panels generate a power output equivalent to at least 25 medium sized nuclear power plants.

Since this didn't pass the smell test: the author is looking at nameplate capacity, which is a completely useless metric for variable electricity production sources (a solar panel in my sunless basement has the same nameplate capacity as the same panel installed in the Sahara desert).

Looking at actual yearly energy generation data, this is more like 1.5 times the generation of an average nuclear power plant (NL solar production in 2023: 21TWh, US nuclear production in 2021: 778TWh by 54 plants).

Which maybe puts more into perspective the actual risks involved here. I'm not saying there shouldn't be more regulations and significantly better security practices, but otoh you could likely drive a big truck into the right power poles and cause a similar sized outage.
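The arithmetic is easy to check (a quick sketch using the figures quoted above):

```python
# Sanity-check the "1.5x an average nuclear plant" figure from above.
nl_solar_twh = 21        # NL solar generation, 2023
us_nuclear_twh = 778     # US nuclear generation, 2021
us_nuclear_plants = 54

per_plant_twh = us_nuclear_twh / us_nuclear_plants  # ~14.4 TWh per plant
ratio = nl_solar_twh / per_plant_twh                # ~1.46

print(f"{per_plant_twh:.1f} TWh per average plant; NL solar is {ratio:.2f}x that")
```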

1053r
38 replies
1d

For the purposes of information security, the nameplate capacity is the correct number to consider, for a very simple reason: we must defend as if hackers will pick the absolute worst moment to attack the grid. That is the moment when the sun is shining and it's absolutely cloudless across the Netherlands, California, Germany, or wherever their target grid is.

At that moment, the attacker will not only blast the grid with the full output of the solar panels, but they will also put any attached batteries into full discharge mode as well, bypassing any safeties built into the firmware with new firmware. We must consider the worst case, which is that the attacker is trying to not only physically break the inverters, but the batteries, solar panels, blow fuses, and burn out substations. (Consider that if the inverters burn out and start fires, that's a feature for the attacker rather than a bug!)

So yes, not only is it 25 medium sized nuclear power plants, it's probably much higher than that! And worse, that number is growing exponentially with each year of the renewable transition.

This was probably the scariest security exposé in a long time. It's much, much worse than some zero-day for iPhones.

A bad iPhone bug might kill a few people who can't call emergency services, and cause a couple billion dollars of diffuse economic damage across the world. This set of bugs might kill tens of thousands by blowing up substations and causing outages at thousands to millions of homes, businesses, and factories during a heat wave. And the economic damage will not only be much higher, it will be concentrated.

idiotsecant
14 replies
1d

This is wildly overstating the issue. Hackers are not going to break into hundreds of separate sites, compromise inverters, compromise relay protection, compromise SCADA systems, and execute a perfectly timed attack. Even if they did, these are distributed resources, they don't all go through a single substation and I doubt any one site could cause any major harm to any one substation.

Instead, they're going to get a few guys with guns to shoot some step-up transformers and drive away.

The problem with infosec people is they tend to wildly overestimate cyber attack potential and wildly underestimate the equivalent of the 5 dollar wrench attack.

1053r
5 replies
23h43m

This isn't hundreds of separate sites that have to be hacked individually. This is fewer than 10 clouds with no security to speak of and the ability to push evil firmware to millions of inverters worldwide, where in a few years at the current rate of manufacturing growth, it will be 10s, and then 100s of millions of inverters.

Yeah, the potato cannon filled with aluminum chaff or medium caliber semi-automatic rifle can take down a substation. But this is millions of homes and businesses, which can all have an evil firmware that triggers within seconds of each other. (There will inevitably be some internal clocks that are off by days/months/years, so it's not like it will happen without warning, but noticing the warning might be difficult.)

And the growth in sales is exponential!

ethbr1
4 replies
23h29m

medium caliber semi-automatic rifle

Technically, anything that can put a hole in an oil-filled transformer. https://en.m.wikipedia.org/wiki/Transformer_types#Liquid-coo...

You don't need to break it... just crack the radiator enough for all the circulating fluid to drain, then it overheats.

applied_heat
1 replies
15h30m

Any transformer over about 5 MVA will probably be equipped with a low oil level switch that de-energizes it

jpc0
0 replies
2h53m

If all you wanted was to kill the power, I don't see the difference...

Sure the repair is easier/quicker but the economic damage was already done...

ethbr1
0 replies
20h14m

Also in the north GA mountains in the 1970s.

Gud
4 replies
23h44m

Most (more or less all) grid operators can operate their network remotely from a single control room.

I suspect most grids are extremely easy to hack (never tried; don't bite the hand that feeds you, etc.).

Info sec is just a hobby of mine. I install high voltage switch gear for a living.

TwiztidK
2 replies
22h25m

I suspect most grids are extremely easy to hack

I'd expect the opposite. All companies controlling equipment that is part of the "Bulk Electric System" have to be NERC CIP compliant and are audited regularly, with large fines for non-compliance. That doesn't guarantee perfect (or even good) security, but it makes security more likely to be a priority.

ufocia
1 replies
15h52m

How do fines make things better? They confiscate resources that could be used to improve.

applied_heat
0 replies
15h39m

The management at the utility doesn’t want to be recognized for being a deficient operator that doesn’t meet standards, so they hire employees to ensure they are compliant

A fine is a black eye for a utility where people pride themselves on the reliability of the service they provide

applied_heat
0 replies
15h43m

A lot of utilities have their own fibre, since they own the poles/towers and need it for tele-protection anyway, so they can have a secure, truly private network between the control room and significant power plants.

lll-o-lll
1 replies
18h36m

Hurray! I have experience that may shed some insights. I worked on SCADA software (3 different ones), for about 15 years, started off as a Systems Engineer for an Industrial Power Metering company (but writing software), built drivers for various circuit breakers and other power protection devices, and wrote drivers and other software for IEC61850 (substation modelling and connectivity standard). I’ve been the technical director of one of these SCADA systems, and in charge of bringing the security to “zero trust”. I’ve been on the phone with the FBI (despite not being an American or in America), and these days I design and lead the security development at a large software company.

I've been out of the Power Industry/SCADA game for about 6 years now, and never had huge involvement with solar farms, so please take this with a large grain of salt, but here is my take. 15 years ago, all anyone would say about industrial networks was "air gap!". Security within SCADA products was designed solely to prevent bad operators from doing bad things. Security on devices was essentially non-existent, and firmware could often be updated via the same connectivity that the SCADA system had to the devices (although SCADA rarely supported this, it was still possible). In addition, SCADA systems completely trusted communication coming back from the devices themselves, making it relatively simple for a rogue device to exploit a buffer overrun in the SCADA. After Stuxnet, plus a significant push from the US government, SCADA systems moved from "defensive boundary, trust internally" to "zero trust". However, devices have a long, long service life. Typically they would be deployed and left alone for 10+ years, and generally had little to no security. Security researchers left this space alone because the cost of entry was too high, but any time they did investigate a device, there were always trivial exploits.

Although SCADA (and other industrial control software), will be run on an isolated network, it will still be bridged in multiple places. This is in order to get data out of the system, but also to get data into the system (via devices, and off-site optimisation software). The other trend that happened over time was to centralize operations in order to have fewer operators controlling multiple sites. That means that compromising one network gives you access to a lot of real world hardware.

Engineers never trusted SCADA (wisely), and all of these systems would be well built with multiple fail-safes and redundancies and so on. However, if I were to be a state-actor, I’d target the SCADA. If you compromise that system, you have direct access to all devices and can potentially modify the firmware to do whatever you want. If there is security, the SCADA will be authorized.

I don't think the security risks are overblown (what's overblown is what people think the real problems are). The systems have gotten ever more complex: we have such complicated interdependencies that it is impossible to analyse them completely and deterministically. The "Northeast blackout of 2003" (where a SCADA bug led to a cascade failure) was used as a case study in this industry for many years, but if anything, I think the potential for intentional destruction is much higher.

applied_heat
0 replies
15h33m

I’m in this space, but plc io networks from Schneider and Rockwell are still “trust internally”, and some HMI or scada has to have read/write to them. At least Rockwell you could specify what variables were externally writeable whereas Schneider was essentially DMA from the network.

g_p
0 replies
23h28m

They don't need to break into separate sites though - the issue at hand is that the centralised "control plane" from the vendor (i.e. the API server that talks to consumers' apps) can be a single, incredibly vulnerable point of failure.

Here's a recent example where a 512-bit RSA signing key was being used to sign JWTs, allowing a "master" JWT to be minted, giving control of every system on that vendor's platform.

https://rya.nc/vpp-hack.html
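For context on why a 512-bit key is so damaging here: a JWT is just two base64url-encoded JSON segments plus a signature, so anyone who can produce a valid signature can mint arbitrary claims. A minimal sketch (the claim names below are hypothetical, not taken from the write-up):

```python
import base64, json

def b64url_decode(part: str) -> bytes:
    # JWT segments are base64url without padding; re-add it before decoding
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

# Hypothetical header and claims (illustrative names only)
header = {"alg": "RS256", "typ": "JWT"}
payload = {"sub": "installer", "role": "admin"}

segments = [
    base64.urlsafe_b64encode(json.dumps(p).encode()).rstrip(b"=").decode()
    for p in (header, payload)
]
# Anyone can read and re-create these two segments; only the third segment,
# the signature, protects them. Factoring a 512-bit RSA modulus is feasible
# on rented hardware, which recovers the private key and lets an attacker
# sign whatever claims they like -- the "master" JWT from the link above.
print(json.loads(b64url_decode(segments[1]))["role"])  # -> admin
```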

tivert
5 replies
23h5m

Getting the grid back online is a laborious manual process which will take (a lot of) time. Think...

It would be even more laborious and take more time to bring things back online if the attacker manages to damage or destroy equipment with an overload like the GP describes.

msandford
4 replies
21h20m

The "turning the grid up to 11" attack isn't really possible. I know it seems like it is, but the inverters will only advance frequency so much before they back off, the inverters will only increase voltage so much. Etc. Sounds scary, isn't practical.

Turning everything off when the panels are at peak output? That lets frequency sag enough that plants start tripping offline to protect themselves and the grid and it'll cascade across the continent in just a few minutes. Then you have a black start which might take months.

There's an excellent video on how catastrophic a black start is. https://youtu.be/uOSnQM1Zu4w?si=x0dA7X7-19CJm6Kf

kortilla
2 replies
11h45m

Months isn’t correct. Unless there was damage it could be recovered within a day.

msandford
1 replies
6h22m

Would love to know more about this. How would that happen? What's the process to bring it back up so fast?

The video has a lot of good info and seems compelling. During the Texas freeze many power company officials said the exact same thing, if the Texas grid went down it would have taken weeks to bring everything back online.

cesarb
0 replies
4h32m

What's the process to bring it back up so fast?

It's called black start (https://en.wikipedia.org/wiki/Black_start), and power companies plan for it; the necessary components are regularly tested. It's not a fast process - it can take many hours to bring most of the grid back up. We had a large-scale blackout here in Brazil last year, and an area larger than Texas lost power; most of it was back in less than a day.

if the Texas grid went down it would have taken weeks to bring everything back online.

The trick word here is "everything". Every time there's a large-scale blackout, there's some small parts of the grid which fail to come back and need repairs. What actually matters is how long it takes for most of the grid to come back online.

tlb
0 replies
7h49m

Inverters may be protected against changing settings, but if you can replace the firmware it can likely cause permanent hardware damage. Which the manufacturer, perhaps under pressure from its government, can do.

t0mas88
5 replies
21h54m

The risk is not turning all solar installations "on maximum". That happens nearly every summer day between 1 and 2pm. Automatic shutoff when the grid voltage rises can be disabled, but more than 9 out of 10 consumer solar installations in the Netherlands already deliver their maximum output on such a day for most of the summer without running into the maximum voltage protections.

The big risk is turning them all off at the same time, while under maximum load. That will cause a brown-out that no other power generator can pick up that quickly. If the grid frequency drops far enough big parts of the grid will disconnect and cause blackouts to industry or whole areas.

It will take a lot of time to recover from that situation. Especially if it's done to the neighbouring grids as well so they can't step in to pick up some of the load.

more_corn
4 replies
17h2m

Not if we have grid scale batteries. Solar shuts off, oh no. Sometime in the next four hours we need to get that fixed or something else up. Also flattens out the demand curve and allows arbitrage between the peak and valley.

ale42
3 replies
7h1m

Problem is, those batteries are not there (yet)...

huijzer
2 replies
5h24m

Don’t underestimate exponentials. Tesla produced 6.5 GWh of battery storage in 2022, 14.7 GWh in 2023, and will probably double again in 2024.

And other battery manufacturers such as BYD grow fast too.

automatic6131
1 replies
3h52m

Always underestimate exponentials: none exist in nature, they're just an early phase of an S curve (sigmoid, if you want the $10 word)

ale42
0 replies
1h36m

Which is kind of normal, we don't need infinite batteries ;-)

mschuster91
3 replies
23h27m

We must consider the worst case, which is that the attacker is trying to not only physically break the inverters, but the batteries, solar panels, blow fuses, and burn out substations.

Power transformers have a loooooooot of thermal wiggle room before they fail in such a way and usually have non-computerized triggers for associated breakers, and (at least if done to code, which is not a given I'll admit) so do inverters and every other part. If you try to burn them out, the fuses will fail physically before they'll be a fire hazard.

1053r
2 replies
23h2m

This is true, especially for low-frequency (high-mass) inverters. The inverters covered here are overwhelmingly high-frequency (low-mass) inverters. We hope that they practiced great electrical engineering and layered multiple physical safeguards on top of the software-based controls built into the firmware.

Of course a company that skimped to the point of total neglect on software security would never skimp anywhere else, right? Right?

:crossed-fingers: <- This is what we are relying on here.

And even if they did all the right things with their physical safety, the attackers can still brick the inverters with bad firmware, making them require a high-skill firmware restore at a minimum, and turning them into e-waste requiring a re-install by a licensed electrician at a maximum.

mschuster91
0 replies
22h56m

Of course a company that skimped to the point of total neglect on software security would never skimp anywhere else, right? Right?

At least in Europe, product safety organizations and regulatory agencies have taken up work to identify issues with stuff violating electrical codes (e.g. [1] [2]) and getting it recalled/pulled off the market.

Sadly there is no equivalent on the software side - it's easy enough to verify if a product meets electrical codes, but almost impossible to check firmware even if you have the full source code.

[1] https://www.bundesnetzagentur.de/SharedDocs/Pressemitteilung...

[2] https://www.t-online.de/heim-garten/aktuelles/id_100212010/s...

blkhawk
0 replies
10h20m

high skill firmware restore at a minimum and turn them into e-waste and require an re-install from a licensed electrician at a maximum.

Well, not even high skill - for "security" reasons, to prevent support issues, and to skimp on testing, the needed information is often only accessible to a chosen few.

Paradoxically, the effect of these "security" concerns often means there are plenty of easily exploited holes in devices like that. And the only people who have them are the ones you need to worry about, instead of some 16-year-old finding one, playing blinkenlights with his friend's parents' house, getting himself into trouble, and getting the hard-coded backdoor removed after the media got wind of it.

If I was dictator of infrastructure, I would ban any non-local two-way communication and would mandate that all small grid-storage solutions run off a curve-flattening model that's uniform and predictable. Basically, they would store first and only be allowed to emit a fraction of their storage capacity to the grid afterwards. Maybe regulated by time of day.

hn_throwaway_99
2 replies
1d

While I agree that the important metric to consider is peak output and not average output, I would still guess that in a country like the Netherlands that peak output is nowhere near nameplate capacity.

Retric
1 replies
22h3m

You can get close to peak output just about anywhere, assuming the panels are angled rather than laying flat. You just can’t get it for very long in most locations.

mapt
0 replies
3h27m

The new method this past year that appears to be highly beneficial is to use various compass orientations of _vertically_ mounted panels. The solar cells got so cheap that every penny we spend on mounting hardware and rigid paneling now stings, and posts driven vertically into the ground with cables strung tight between them are cheaper than triangles, way easier to maintain (especially in places with winter), and trade a lower peak (or even a bimodal peak) for a much wider production curve.

verisimi
0 replies
13h42m

Tldr; We can't talk about proper numbers cos hackers.

immibis
0 replies
7h28m

The "bad iPhone bug" scenario happened a few weeks ago, in the form of Crowdstrike. You underestimated the damages.

eldaisfish
5 replies
1d

you are splitting hairs about the wrong issue.

When it is sunny in the netherlands, it is likely sunny everywhere in NL because of how small the country is.

This is the situation where having so much solar power capacity (kW) is dangerous.

The risk scales with energy output, but I would not term nameplate capacity a "completely useless metric".

hinkley
2 replies
1d

I dunno. I lived next to a small inland sea most of my adult life. The number of times someone on the other side of town asserted it was raining when in fact it was not was quite high.

Every adult in Seattle eventually has to learn that if you have an activity planned on the other side of town, if you cancel it because it’s raining at your house you’re not going to get anything done. You have to phone a friend or just show up and then decide if you’re going to cancel due to weather.

Now to be fair, in the case of Seattle, there’s a mountain that multiplies this effect north versus south. NL doesn’t have that, but if you look at the weather satellite at the time of my writing, there are long narrow strips of precip over England that are taller but much narrower than NL.

eldaisfish
0 replies
17h56m

clouds and rain do not behave the same as the sun.

What point are you trying to make here?

Aachen
0 replies
4h0m

"Sometimes it rained in a part of town only" does not disprove the person saying "it can be sunny virtually everywhere at the same time in a small country"

For a simple demonstration, https://www.buienradar.nl/nederland/zon-en-wolken/wolkenrada... has been showing cloudless hours pretty regularly in the last month. Someone meaning malice can certainly keep an eye on that for a few days to find a good moment

lucianbr
1 replies
23h26m

When it is sunny in the netherlands, it is likely sunny everywhere in NL because of how small the country is.

Often friends of mine who live in my city report rain when I see none, or no rain when it's raining outside my window. That's to say nothing of a location 30km away, where basically anything can happen. Do we live on the same planet?

Aachen
0 replies
3h58m

On which planet does the regular occurrence of one phenomenon disprove the regular occurrence of another?

It can both be true that weather differs locally on most days and that the whole country is cloudless for a fair number of hours every late-summer month (easily within a reasonable waiting time for an attacker).

epistasis
4 replies
1d

You are talking about energy, which is not the same thing as power. TWh == energy, GW == power.

The distinction is important, especially in the Netherlands, which has a capacity factor of only about 10%-15%, whereas most of the US will be at least 20%-25%, which is twice as high.

I'm not sure of the typical number of reactors in the Netherlands, but using the US average of 1.6/power plant may not be the most representative comparison.

delroth
2 replies
1d

I have no idea what you're talking about, since nowhere did I use solar capacity factor data nor did I look at number of reactors per plant.

epistasis
1 replies
19h46m

You are using both with your energy generated numbers. That's where they come from.

Your solar TWh comes from 25GW at ~15% capacity factor, and to get your nuclear numbers you're looking at 1.6GW for each of the nuclear "plants", when each reactor is usually about 1GW or less. There are ~90 reactors in the US, at 54 plants. The article is assuming 1 reactor per plant for the Netherlands.
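Making the implied capacity factors explicit (a sketch using the round numbers from this thread):

```python
# Back out the capacity factors implied by the numbers in this thread.
HOURS_PER_YEAR = 8760

# NL solar: ~21 TWh/yr from ~25 GW nameplate
nl_cf = 21e3 / (25 * HOURS_PER_YEAR)            # GWh / (GW * h) -> ~10%

# US nuclear: 778 TWh/yr across 54 plants -> average *output* per plant
avg_gw_per_plant = 778e3 / 54 / HOURS_PER_YEAR  # ~1.64 GW

print(f"implied NL solar capacity factor: {nl_cf:.0%}")
print(f"average US nuclear output per plant: {avg_gw_per_plant:.2f} GW")
```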

ossyrial
0 replies
4h37m

The article is assuming 1 reactor per plant for the Netherlands.

Small addition that isn't mentioned in the English version of the article, but only in the original Dutch version: the article talks specifically about the Borssele power station [0] (which has a power output of 485MW).

[0]: https://en.wikipedia.org/wiki/Borssele_Nuclear_Power_Station

kkfx
0 replies
22h52m

The point is about instantaneous power injected, not energy. Keeping an AC grid at the right frequency is a tricky business, because production and consumption must match at every instant.

Too much production and the frequency skyrockets; too little and it plunges.

Classic grids are designed over large areas to average out the load seen by big power plants, so those plants see only small instantaneous changes in demand: a 50MW plant might see 100-300kW swings, which it can handle quickly enough. With massive PV, wind, etc., demand as seen by a big plant can change MUCH more: a 50MW plant might suddenly need to scale back or ramp up by 10MW, which is far too much to sustain. When that happens and demand is too high, the frequency plunges, and grid dispatchers have to cut off large areas to shed load (so-called rolling blackouts). When demand drops too quickly, the frequency skyrockets, and large plants that can't scale back fast enough simply disconnect. Once they disconnect, generation falls and the frequency stabilizes; unfortunately, most PV is grid-tied, so when a plant disconnects, most PV inverters that saw the frequency spike disconnect as well, creating a cascade of rapidly alternating too-low and too-high frequency and causing blackouts over vast areas.

Long story short, a potential attack is simply planting a command: "at solar noon on 26 June, stop injecting into the grid, and keep not injecting until solar noon + 5 minutes". Within a second or so (allowing for clock-sync jitter), all inverters of a certain brand stop injecting; generation falls, there are some rolling blackouts, and the large plants compensate quickly. Then the 5-minute timer expires and all inverters restart injecting en masse while the large plants are still at full power: the frequency skyrockets, large plants disconnect, most grid-tied inverters follow them, and there is a large chance that an entire geographic segment of the grid goes down. Interconnection operators can't know what to do in so little time, and the blackout can quickly grow even larger as almost all interconnections go down to protect the still-active parts of the grid, causing more frequency instability and more blackouts.

Such an attack might lead to several days without power.
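A toy illustration of that timing (a sketch, not a grid model; the inertia constant and the 20% imbalance are assumed round numbers):

```python
# Toy illustration (NOT a grid model): how fast frequency drifts when
# generation steps away from load, per the classic swing equation
#   df/dt = f0 * (P_gen - P_load) / (2 * H * S_base)
# H = 4 s and the 20% imbalance are assumed round numbers.
f0, H, dt = 50.0, 4.0, 0.1   # nominal Hz, aggregate inertia (s), sim step (s)

def seconds_until(threshold_hz: float, power_imbalance_pu: float) -> float:
    """Seconds for frequency to drift from f0 to a threshold at fixed imbalance."""
    f, t = f0, 0.0
    while (f > threshold_hz if threshold_hz < f0 else f < threshold_hz):
        f += dt * f0 * power_imbalance_pu / (2 * H)
        t += dt
    return t

# All inverters drop out at once: a 20% supply deficit
print(seconds_until(49.0, -0.20))  # ~0.8 s to a typical load-shed threshold
# They all come back while big plants still run at full output: 20% surplus
print(seconds_until(51.5, +0.20))  # ~1.2 s to typical over-frequency trips
```

The round numbers are made up, but the point survives: with high imbalance and low inertia, the window for any human operator to react is under a few seconds.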

cesarb
3 replies
1d

the author is looking at nameplate capacity, which is a completely useless metric for variable electricity production sources

For solar panels, the nameplate capacity is usually also the power generated at the peak production time, which is the moment when an attacker turning off all inverters at the same time would have the most impact.

That is: for an attack (or any other failure), the most important metric is not the total power produced, but the instantaneous power production, which is the amount which has to be absorbed by the "spinning reserve" of other power plants when one power plant suddenly goes offline.

wiredfool
1 replies
23h46m

No, the nameplate capacity is what a solar panel will produce under perfect lighting, independent of the site where it's installed.

The peak theoretical power output of a solar panel depends on where it's installed: inclination, temperature, elevation, and so on. The actual peak power additionally has to account for weather and dirty panels.

1kw nameplate in Ireland (or the Netherlands) is never going to give you an instantaneous 1kw output -- you're going to be lucky to see 60% of that.

misiek08
0 replies
11h15m

But 60% of 25GW is less than 3GW? You need to take down more capacity than the buffer before power plants will disconnect from the grid as a fail-safe. Bringing the grid back up alone may take a few days, but all the appliances out there will be down for weeks... ClownStrike brought us pen-written boarding passes; glad we don't install crapware on hospital hardware.

preisschild
0 replies
12h2m

No. You will definitely not get peak capacity even in the Sahara. Those numbers were measured under perfect conditions in a laboratory, not under real circumstances.

hinkley
1 replies
1d

If memory serves, and I’ll admit it’s pretty fuzzy, the US tends to make ridiculously large nuclear reactors and Europe has an easier regulatory situation so they make more of them and smaller.

So in addition to the other stuff people mentioned, you might be off by another factor of 2 there. They also said “medium sized” so let’s call it 3.

mandevil
0 replies
15h24m

This might have been true back in the 1970s, but at least as far as current development goes, is not.

The only new (non-Russian) European design built in the past 15 years is the EPR at 1600 MW. The only new American design built in the past 15 years is the AP1000 which as the name suggests is 1000 MW (technically 1100). AP1000 uses a massively simplified design to try and be much safer than other designs (NRC calculations say something like an order of magnitude) but is not cost competitive against most other forms of power generation. Which is why after Vogtle 3 and 4 there are no plans for more of them in the US.

It's not that EPR is any better - they are actually doing worse in terms of money and schedule slippage than Vogtle did. Flamanville 3 had its first concrete poured in 2007 and still hasn't generated a single net watt!

It turns out that the pause in building nuclear reactors in the West from about 1995-2005 - which in the US actually started earlier, in the early 1980s, when after Three Mile Island things already under construction were finished but nothing new was started, with Western Europe following a similar path after Chernobyl - basically gutted the nuclear construction industries in both, and they haven't been built back up. The Russians kept at it, and the South Koreans have moved into the market (and China is building a huge number domestically, though I don't think they've built any internationally), but Western Europe and the US are far behind, and after Fukushima Daiichi I strongly suspect the Japanese are in the same boat. Without trained workers you can't build these in any predictable way, and when you pause construction for a decade you lose all of the trained workers, and it's really hard to build that workforce back up again.

verisimi
0 replies
13h38m

Not only that: solar is entirely misaligned with power requirements. Over the year it may produce 1.5x what nuclear does, but in the winter, when demand is highest, the energy provided will be far less on account of short days and low light - typically a winter day yields about 1/10 of the energy of a summer day. So: overproduction when unrequired, underproduction when required.

bramblerose
0 replies
23h38m

It's the power output that is relevant for the failure mode described in the article, not the yearly production. And in terms of power output, 20GW is an incredibly common number for peak solar production (see e.g. https://energieopwek.nl/ at the end of Jul this year) in summer. Borssele (the medium-sized power plant named in the article) has a 485MWe net output. So yes, we _are_ talking about >25 mid-sized nuclear power plants!
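Checking that claim against the numbers given (a quick sketch):

```python
peak_solar_gw = 20   # a common NL midday solar peak (see energieopwek.nl)
borssele_mw = 485    # Borssele net output, the article's "medium-sized" plant

equivalent_plants = peak_solar_gw * 1000 / borssele_mw
print(round(equivalent_plants))  # -> 41
```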

Aachen
0 replies
5h1m

(a solar panel in my sunless basement has the same nameplate capacity as the same panel installed in the Sahara desert).

Isn't latitude taken into account by grid operators when determining expected peak output? The owners would otherwise be installing bigger (more expensive) inverters than needed, so they'd know this value at least roughly. Even smarter would be to include the panel angle etc., but I'm not sure what detail it goes into compared to latitude, which is a very well-known and exceedingly easy to look up value for an area.

I certainly see your point about it not being apples to apples, but on a cloudless summer day, the output afaik genuinely would be the stated figure (less degradation and capacity issues). The country is small enough that it's also not unlikely that we all have a cloudless day at the same time

One might well expect some sun in summer and put some of the used-in-winter gas works into maintenance or, in the future, count on summer production of hydrogen— although hacks are likely a transient issue so I wouldn't foresee significant problems there

yetihehe
47 replies
1d2h

The owner of the panels and inverters can meanwhile establish a connection with that manufacturer using an app or website, and via the manufacturer see how their own panels are doing

It wasn’t necessary from a technical standpoint to let everything run through the manufacturer’s servers, but it was chosen to do it this way.

(emphasis from article)

I'm working on an IoT cloud system. It was chosen to be done this way because neither consumers nor installers have any expertise whatsoever to set up their own network or make any devices accessible from outside (and they want their panels to be accessible when they are away from home). I can do it, most readers of HN could do it, but the typical consumer or installer can't. Sad but true.

lucianbr
12 replies
1d1h

What's the reasoning for not allowing both control paths, via cloud but also locally? So that people who can and want to, will use the local control.

yetihehe
6 replies
1d1h

Cheapness. It would need to be at least semi-secure, the phone application would need to find those devices locally, and it would have to be synchronized with the cloud anyway. Synchronization is error-prone, and we had problems with devices sometimes responding twice, or very slowly, through the local interface (through the cloud it was much faster - no idea why, not our firmware). Also, not enough people request that feature; most don't care and think losing internet isn't frequent enough to warrant worrying about.

MostlyStable
4 replies
1d1h

Why not offer an either/or rather than both? Some people (I am one of them) actively do not want these kinds of things to be managed through the cloud servers. I don't want it to sync, I want to fully turn that off. I want to locally host, and I'm willing to take responsibility for that feature, including when it breaks. All I want is access to whatever the data reporting and control APIs are.

I get that I'm a tiny minority, and that very few customers want what I want. But A) it seems like giving me what I want should be very cheap (i.e should not entail ongoing customer support costs beyond normal, and in fact would get rid of the small cloud hosting cost) and B) I'd be willing to pay a premium to get it.

wmf
3 replies
1d

In some areas like cameras there are a decent number of cloud-free alternatives. Hopefully as the IOT market grows we'll get cloud-free versions of everything.

I think you're too optimistic about costs though. Providing any support at all, even one-time during the install, is expensive and cloud-free IOT is going to require support due to home networks being broken.

MostlyStable
2 replies
1d

Yes, support is expensive, but what I am proposing will, if anything, reduce support. I'm imagining something where, if I opt into local control, I give up all rights to any support that is not related to the core functionality of the device. Take the solar panels/inverters in the article: if I opt in to local control, then the only support I am entitled to is if the solar panels stop generating power or the inverter stops inverting. Anything network related is no longer the company's problem, because I have assumed complete responsibility for that. I'd even be willing to agree that, if I ever decide I don't want local control and want to switch to cloud hosting, I will pay for the support required to switch me back over.

So if my home network breaks, that is not their problem. And they don't need to set it up, they just need to make it possible for me to set up, including figuring out how to make it work with my potentially broken home network. If it requires a new router because mine doesn't provide some necessary functionality? Not their problem. Etc. Etc.

yetihehe
0 replies
10h19m

So if my home network breaks, that is not their problem.

Differentiating between people like you, who can take the blame for misconfiguring a device, and the 99% of other consumers is not viable for most companies. Also, if you bought our device and wanted to do that, I would probably make a firmware version for you that connects to your endpoint and give you some docs. But:

- Just talking and coordinating that possibility for one user would cost my company more than the final price of the device, considering the time spent on it.

- You would have to spend a lot of time to implement a lot of functionality to glue our protocol to your desired endpoint.

wmf
0 replies
1d

Consumer electronics doesn't work that way. If people can't get a product to work they will return it to the retailer and when the retailer gets a lot of returns they will penalize the company or drop them completely.

lucianbr
0 replies
23h19m

I have some Shelly devices which manage to do all that, and cost next to nothing. They work with local REST services or the cloud, with password protection and TLS. Sure, it costs more than zero, but not much.

In the end, freedom goes away because we could not be arsed to ask for it at least, let alone fight.

michaelt
2 replies
1d

Often there are two control paths. Sometimes more! Plenty of inverters will quite happily give you an RS232 port specification and you can create your own dongle!

However, for purpose of the security of the nation's power grid, I don't just need my inverter to be secure, I need pretty much everyone's inverter to be secure. If an attack bricks 95% of solar inverters, the fact the nerdiest 5% of users have their inverters airgapped won't stop the grid having a lot of problems.

lucianbr
1 replies
23h23m

RS232 port specification and you can create your own dongle!

This is just a way of pretending to give access while making it as hard as possible. We are talking about a device that is already connected to the network. The local path is not some REST service, but a serial port for which I need to fabricate some hardware? Don't piss on me and tell me it's raining.

michaelt
0 replies
21h55m

Perhaps I wasn't clear - when I say "Sometimes more!" I mean many cheap chinese inverters actually support four options:

1. Cloud management with their app.

2. Wifi management without the cloud (when you're on your home wifi).

3. Unplug the wifi dongle from the inverter for a fully offline system. You don't really need your inverter on the internet anyway.

4. Unplug the wifi dongle and DIY whatever you want, the dongle's just a serial-to-wifi converter.

That's not to say the security of any of this stuff is good, of course. In fact the security is pretty bad! But you can for sure get inverters with multiple options for non-cloud operation.

toast0
1 replies
1d

The real answer is it's more than twice the work to have both paths, and there's not enough demand for it.

That said, Apple Homekit integration is local network based, so products that do that and the typical manufacturer cloud system have done both paths.

Homekit is a pain to use without Apple hardware/software, but there you go. (There's a plugin for HomeAssistant, but I'm still classifying that as a pain)

akira2501
0 replies
21h17m

I have a weather station.

It can connect to standard cloud weather service providers and I can view my data there.

I can also just redirect that exact same protocol to any other host or IP I specify.

They built it once and just gave me the ability to control WHERE that data goes. It's honestly not that hard.
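
To illustrate how little is involved: many consumer weather stations push readings as a plain HTTP GET with query parameters, so "redirect the protocol" can be as simple as pointing the station's server setting at a box you control. A minimal sketch, with the path and field names (tempf, humidity) invented for the demo since every vendor's format differs:

```python
import threading
import urllib.parse
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = {}

class CollectorHandler(BaseHTTPRequestHandler):
    """Your own endpoint standing in for the vendor's cloud service."""
    def do_GET(self):
        query = urllib.parse.urlparse(self.path).query
        received.update(urllib.parse.parse_qsl(query))
        self.send_response(200)
        self.end_headers()
    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), CollectorHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The station only needs a "custom server" host setting pointed here;
# below we simulate the station's own upload request.
endpoint = f"http://127.0.0.1:{server.server_port}/weatherstation/update"
params = urllib.parse.urlencode({"tempf": "68.2", "humidity": "41"})
urllib.request.urlopen(f"{endpoint}?{params}").close()
server.shutdown()

assert received == {"tempf": "68.2", "humidity": "41"}
```

The manufacturer's only extra work is exposing that one host/port setting; the upload code path is identical either way.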

pheatherlite
9 replies
1d1h

If we've learned anything from the security cam and baby cam scandals, then it's that convenience is king and we as a society would rather risk everything than be arsed to take few additional steps to setup/learn something to prevent such basic breaches. We (the society) don't even want to change the default password on most things.

titzer
4 replies
1d

People gonna be people. It's up to engineers and product designers to make things user friendly but also safe-by-default. If something needs to be configured, then provide instructions on how to configure it. Instead of pretending that it's society's fault (can't be arsed), maybe ask why the IT industry can't make instructions that are written out--explicit, fairly standard, and easy to follow--like the manual for putting together a piece of furniture. Or why the stupid device doesn't come with a randomly-generated strong password taped to it.

yetihehe
2 replies
10h10m

fairly standard, and easy to follow--like the manual for putting together a piece of furniture.

And people still have problems following those instructions. We give out a lot of instructions, but some people just don't read them. Or can't understand and follow them. Example from this week:

Yeah, the main support guy for our client is on holidays this week and only a girl from sales is available, and she doesn't know why our device doesn't work. She tried resetting her phone's wifi but still can't pair our zigbee smart hub connected with a usb modem stick (problem: no one told her she needs to message us to actually activate the sim card before installing the device for the end client).

Yes, you can solve this one problem, but there are many more we haven't seen yet. Consumer support does not scale, and you can't write tests for something you don't yet know will be a problem.

titzer
1 replies
4h30m

no one told her she needs to message us

Did you have that written down somewhere? In instructions, which was my point.

Sorry for the snark, but I think "no one knows how to do anything" coupled with "Oh, the idiot didn't know to discombulate the canooter valve before inserting into the tinklerater--it's so obvious we didn't write it down" supports the point I'm making.

yetihehe
0 replies
3h29m

This was a custom configuration process tailored specifically for that one client, and it was written up in a detailed instruction with each step tested and with screenshots. Apparently that instruction was not distributed internally at our client's office. So the mere existence of instructions is often not enough; you will still get very confusing calls about how to attach a toaster to 36V to charge a wallet (when you did not send any toasters or wallets and the client just wants your premade pizza).

Aachen
0 replies
4h39m

Instead of pretending that it's society's fault (can't be arsed), maybe ask why the IT industry can't make instructions that are written out

Because the competitor that doesn't have this installation "hassle"¹ will sell more units. It ends back up with society choosing to behave this way in aggregate

Society is obviously good (wouldn't want to live without it) and capitalism has held up for hundreds of years now (not sure it's the best solution, but with the tweaks in place in many well-faring countries it seems to work okay for them), but I do believe "society" does have a tendency to go for easy and cheap over complicated and thorough when the need is not self-evident and has not been tangibly pointed out in living memory

¹ Example of hassle that HN users may not think of as exceedingly difficult: iOS setup. My dad asked me to help him set up his new tablet. I've fallen for Apple's motto and was annoyed he doesn't just do it himself. So we sit there and... lo and behold, it's good that he asked me. I forgot the details by now, but things like what the heck screen time is, whether the Find My network is something to enable, whether to enable automatic updates, what to do about Siri, whether Apple Pay is something he needs to use banking apps, etc. are not things he'd have any idea of, and I can give the necessary context from my IT background even if I don't have iOS myself.

However, we still needed to contact their support because we both couldn't figure out how to get an Apple account set up. The continue button was disabled. I tried scrolling: nothing. Tapping the button: nothing. No error message, nothing. Turns out, the flow was only tested in portrait mode, but my dad wants landscape mode (easier for typing) and so we never tried portrait. A part of the screen was in fact scrollable (I've used Apple devices before and my dad used the old tablet weekly; it's not like we're not familiar with their design patterns or don't know how to use a touchscreen) and revealed a required input field that would have been visible by default in portrait.

Next, even setting up a standard Microsoft 365 email server in Apple Mail wasn't self-evident, iirc because he had never heard of the word Exchange, and the oauth flow worked across multiple devices with again different bugs in Apple's own UIs (one needed landscape instead of portrait, another required some trick with zooming beyond intended bounds iirc)

"Hassle" is already perceived in setting up what people proclaim the most user-friendly of systems. Units that don't need it will absolutely sell better, even if that leads to a potential national energy supply risk

akira2501
2 replies
21h14m

and we as a society would rather risk everything than be arsed to take few additional steps

Large manufacturers would like you to think this. It would provide them a convenient excuse for not even trying to differentiate the market along these lines.

We (the society) don't even want to change the default password on most things.

Actually.. I just want to use my device _first_ and not go through some manufacturer controlled song and dance of dark patterns.

In my experience, if you don't pre load the user with this garbage, and then wait for them to have an actual _need_ that depends on the feature, they're FAR more compliant with following even lengthy instructions to get it done.

It's more a problem of aligned benefits and timing than anything else.

yetihehe
1 replies
10h7m

In my experience, if you don't pre load the user with this garbage, and then wait for them to have an actual _need_ that depends on the feature, they're FAR more compliant with following even lengthy instructions to get it done.

Nope, they want to have it working out of the box like with any other manufacturer out there. If you enable functions only after the user asks for them, they will comment "this sh*t doesn't work" on your app store page. Then you have to respond to each comment with "what doesn't work, could you specify please?", and by the time that user has enabled the functionality several days later, several others have left such comments in the meantime.

akira2501
0 replies
8h18m

they will comment "this sh*t doesn't work" on your app store page.

What if I told you those 3% of people will say this no matter what you do. These comments and the reality of your product are entirely disconnected. We had a small userbase and added voice uploads into the app; unsurprisingly, about 3% of them are clearly impaired in some way when they leave a message. [x-files theme].

In any case, a simple "Fast / Slow Setup?" question to start is all you need, and a "Do More Setup?" after they finish one item has, again, in my experience, been entirely sufficient.

Reasonable people understand, "oh.. this needs the cloud.. and I didn't do that part yet.. so I'll go ahead and click the 'social media provider' button." If you also decide this is a good time to ask about a news letter, well, you got what you bargained for.

yetihehe
0 replies
1d1h

We (the society) don't even want to change the default password on most things.

Like you wouldn't believe.

My most memorable case of insecure IoT devices: a wifi socket that was sending the wifi SSID and password of the network in cleartext in every ping packet to Chinese servers.

Nextgrid
7 replies
1d2h

The cloud can operate as a dumb TURN relay relaying E2E-encrypted traffic. Then the worst the cloud can do is deny service to remote management (and even then, local management would still work), but it wouldn't be able to send direct control commands to the equipment since they don't have the authentication nor encryption keys.

This also makes it simpler from a programming point of view - instead of having separate cloud sync & local control protocols, you just have one local protocol and you merely tunnel it through the (dumb) cloud if you can't connect directly.
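
A rough sketch of that split, assuming a pre-shared key provisioned at install time (a real system would use a vetted AEAD or Noise-style protocol that encrypts as well as authenticates; bare HMAC here just shows who can and can't forge commands):

```python
import hashlib
import hmac
import json
import secrets
from typing import Optional

# Key shared between the app and the inverter at install time (e.g. via
# a QR code on the unit). The cloud relay never learns this key.
DEVICE_KEY = secrets.token_bytes(32)

def seal_command(key: bytes, command: dict) -> bytes:
    """App side: serialize a command and append a MAC over it."""
    payload = json.dumps(command, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest().encode()
    return payload + b"." + tag

def relay(frame: bytes) -> bytes:
    """Cloud side: a dumb TURN-style relay just forwards opaque bytes.
    It can deny service by dropping frames, but cannot forge commands."""
    return frame

def open_command(key: bytes, frame: bytes) -> Optional[dict]:
    """Device side: verify the MAC before acting on anything."""
    payload, _, tag = frame.rpartition(b".")
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        return None  # forged or tampered with in transit
    return json.loads(payload)

frame = seal_command(DEVICE_KEY, {"op": "set_export_limit", "watts": 3000})
assert open_command(DEVICE_KEY, relay(frame)) == {"op": "set_export_limit", "watts": 3000}
# A relay (or attacker) altering the payload is detected:
assert open_command(DEVICE_KEY, b"X" + frame[1:]) is None
```

The same `seal`/`open` pair works over the LAN or through the relay, which is the "one protocol, two transports" point.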

yetihehe
4 replies
1d1h

It could, but this requires storing historical usage data on the devices. If you store that encrypted data on the device, getting it to your mobile phone is super slow. If you store it in the cloud, you can get historical data even if your device is dead or has 256 BYTES of memory and 1 megabit of flash storage. We have such devices, very effective at managing a local municipal heating network and controlling several thermal controllers each via RS232 or RS485. Fortunately we preemptively moved everything onto a VPN'ed mobile network; we need special approval to touch anything on that network and can't connect without being granted access, so when the EU started moving on cybersecurity this year, we were covered.

This also makes it simpler from a programming point of view - instead of having separate cloud sync & local control protocols, you just have one local protocol and you merely tunnel it through the (dumb) cloud if you can't connect directly.

Having only cloud protocol is even simpler, I've done all of the above (I do backend and our firmwares).

Nextgrid
3 replies
1d1h

we preemptively moved everything into VPN'ed mobile network

Unless your device itself is handling the VPN, I have bad news for you if you trust the mobile network to not open your devices up to malicious attackers: https://berthub.eu/articles/posts/5g-elephant-in-the-room/

yetihehe
2 replies
1d1h

We consider "they hacked the mobile network VPNs AND had time to reverse-engineer our protocol before being booted out of the network" as too high a level to be resolved by us. If someone has enough resources to do this, they will also just hack into a standard-level secured server at the municipal office, and there will probably be no one there to stop them or discover what went wrong.

adrianN
0 replies
1d

Do you at least fuzz your software?

DoctorOetker
0 replies
22h54m

reversing the protocol can be done in advance, if they order your product

wmf
0 replies
1d

I don't think E2E is simpler to program if you want to get it right. There are entire companies whose raison d'être is actually managing keys properly (e.g. Signal, Tailscale).

akira2501
0 replies
21h19m

This should be the basic model. A fully third party TURN service. You pay $20/mo to keep your home connected, and all devices and providers can use a standard protocol, and users remain fully in control of their data.

kkfx
3 replies
1d

Victron (to cite an NL vendor) can actually operate perfectly in LAN-only mode via MQTT and ModBUS, also offering a (bad) local WebUI for setting pretty much anything, including a display for said WebUI in a framebuffer with an embedded mini-keyboard. It's up to the installer to decide whether to go with their cloud offer or not.

The sole remark I have against them (besides the not-so-good software quality) is the impossibility for individual owners to do offline updates: we can upgrade via the VRM portal but not by downloading the firmware and flashing it locally, even though the needed device is on sale, because they offer firmware files only to registered vendors.

Fronius (to remain in the EU) has a local WebUI which needs a connection only for firmware updates. Even though, differently from Victron, it's not a Debian-based system with sources available but a closed-source one, and they unfortunately offer only a very limited REST API and a very slow ModBUS, still anything can be done locally.

I'm not sure, since I don't have any myself, but SMA (Germany) and Enphase (USA) seem to be able to operate offline as well.

That stated, yes, you are damn right in saying most installers have no competence. Thankfully where I live self-installation is allowed (at least so far), but that simply demands better UIs and training for them, perhaps avoiding the current state of the industry: an immense amount of CRAP at the OEM level, with most "state of the art" systems not at all designed to be used in good ways (see below), and absurdly high prices to the customer, at a level where installing p.v. stops being interesting. 4 years ago I paid 11.500€ for my 5kWp/8kWh LFP system; the smallest offer to have it designed and built by someone else was ~30.000€, the most expensive ~50.000€, and all 6 offers I tried showed some unpleasant issues and incompetence.

About OEMs, just observe how ABSURD it is that there is no damn DC-to-DC direct car charger. Most EVs now have 400V batteries, the same as stationary batteries, with equal BMS comms. Why the hell not sell an MPPT-to-CCS combo direct solution? Ok, we do not ONLY charge from the Sun, but then it's perfectly possible to have a combo charging station with DC for p.v. and AC for the grid, switching from one to the other as needed. It's ~30% energy lost in double conversion.

Why no DC-to-DC high-power appliances that still run DC internally (A/C, hot-water heat-pump heaters etc)?

Why not a modern standard protocol for integration of anything instead of building walled gardens?

Long story short, OEMs have chosen the cloud model partially because most installers are electricians able to use a desktop only by holding the mouse with one hand and clicking with the other, but also because they have no intention of making user-interesting solutions in an open market...

ansgri
2 replies
4h32m

I'm not a pro in these systems (yet, I hope), but my understanding is that, besides lack of demand, HVDC is a safety nightmare compared to AC, and inverters are getting more efficient each year. So, even given the choice, I'd keep AC home-wide distribution and set up inverters in key places, with exactly the highest required voltage.

kkfx
1 replies
2h4m

Well, yes, DC is unsafe because if you get electrocuted, instead of being "pushed out" you get "attached" to the conductors. But if you have a p.v. inverter at home you already have a 400V DC line from the modules to the inverter; how different would it be to have the same run to the garage? Or the same to an exterior heat-pump unit (where the DC compressor is typically located), which is also insulated by the mere fact that you drive it via a remote?

Aside from that, I'm not much of an expert either, but from DC to AC we are pretty efficient: around 98% of the energy gets converted. The other way round it's pretty inefficient, meaning a loss of around 30%. So while your p.v. is pretty efficient at generating AC, your car is pretty inefficient at generating DC from AC to recharge. Your heat pump would gain as well.

To be clear, I do not advocate DC for every DC home appliance like washing machines or dishwashers, just for energy-intensive appliances like big heat pumps and car chargers. Nothing more.

ansgri
0 replies
8m

Valid points, though for safety I thought less about shock and more about fire: the DC arc is much more stable, thus the switches, breakers and relays have to be more robust. The rated DC voltage of magnetic relays is about 1/10 of AC.

MathMonkeyMan
2 replies
1d1h

To be fair, I can do it only if I have time and physical access to the network. Home routers have different gateway IPs, different web interfaces, different password policies (e.g. there might be an admin password and an additional password for changing anything), etc.

It reminds me of <https://xkcd.com/627/>, but when you're launching a product that isn't good enough.

It's hard enough to open up a port even with UPnP (typically disabled) and other made-for-purpose tech. Torrent clients end up trying to poke holes and such. Service discovery might work via local UDP broadcast, or it might not. LAN clients might live at 10.* or 192.* or be isolated by default. It's easier to just go onto the public internet and contact some mysterious server. Botnet by design.
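
For what it's worth, the UDP discovery part itself is tiny; the pain is everything around it (isolated VLANs, broadcast filtering, client firewalls). A loopback sketch of the request/reply shape, with the probe strings invented for the demo (real devices use mDNS/SSDP or vendor-specific packets):

```python
import socket
import threading

PROBE, REPLY = b"WHO_IS_INVERTER?", b"INVERTER_HERE"

def device(sock: socket.socket):
    """Device side: answer one discovery probe with an identity reply."""
    data, addr = sock.recvfrom(1024)
    if data == PROBE:
        sock.sendto(REPLY, addr)

# On a real LAN the probe goes to the broadcast address (and may be
# filtered by the router); loopback keeps this demo self-contained.
dev_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
dev_sock.bind(("127.0.0.1", 0))
threading.Thread(target=device, args=(dev_sock,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
client.sendto(PROBE, dev_sock.getsockname())
reply, device_addr = client.recvfrom(1024)

assert reply == REPLY
assert device_addr == dev_sock.getsockname()
```

Everything beyond this (whether the broadcast actually crosses the network, whether the client is on an isolated VLAN) is exactly the part that "just use the cloud" sidesteps.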

BlueTemplar
1 replies
11h13m

You mention IPv4. We're in 2024, this is getting ridiculous.

Governments should have done the same thing as with digital TV transition(s) : first ban selling devices that can't do IPv6, then ban selling (most) devices advertising they can do IPv4.

yetihehe
0 replies
10h0m

Here comes the Matter protocol to the rescue; it supports IPv6 natively. It's even more complicated than Zigbee and of course doesn't specify all the devices available (though 1/4 of the protocol specification is dedicated to smart-fridge functionality, because one fridge producer actually had someone collaborate with the protocol makers), and it allows for "manufacturer specific fields", which means all manufacturers will have incompatible implementations of some fields anyway and you can't control them universally.

bee_rider
1 replies
1d1h

Also administering a bunch of IOT systems is a pain. If something is an open source community project, ok, I’ll play. If somebody is selling a product they are responsible for making sure it works.

Gibbon1
0 replies
1d

You could put an SQL database on a local device and just access it remotely like anything else. But you are correct, you're stuck with administering each and every one of them.

The standard go-to Raspberry Pi solution will up and die every few months. And half the time you'll need physical access to get it back. It takes a lot of work to develop an embedded system that has enough reliability.
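
The local-database half really is the easy part. A sketch with SQLite (in-memory here; on a device this would be a file on flash, and the schema is invented for the demo), leaving aside the hard reliability and remote-access work:

```python
import sqlite3

# In-memory for the demo; on a real device this would be a file on
# flash (with WAL mode and careful syncing to survive power loss).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE readings (
    ts    TEXT NOT NULL,   -- ISO-8601 timestamp
    watts REAL NOT NULL    -- instantaneous inverter output
)""")

def log_reading(ts: str, watts: float) -> None:
    db.execute("INSERT INTO readings VALUES (?, ?)", (ts, watts))

log_reading("2024-07-01T12:00:00", 3100.0)
log_reading("2024-07-01T12:05:00", 2950.0)
log_reading("2024-07-01T12:10:00", 3050.0)

# "Remote access" is then just a query over whatever transport you
# choose to expose (a local REST endpoint, SSH, etc.).
(avg_watts,) = db.execute("SELECT AVG(watts) FROM readings").fetchone()
assert round(avg_watts, 1) == 3033.3
```

Which supports the point: the code is trivial, and the cost is entirely in keeping that box alive unattended for years.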

USiBqidmOOkAqRb
1 replies
11h29m

they want their panels to be accessible when they are outside their home

I call bullshit. They've been conditioned to think that they want it, because all product brochures have it.

What kind of tangible benefit could there be to know how bright the sun is at your home while you're not there? A cool party trick to virtue signal or a break between doomscrolling, I suppose, but it's not like you're gonna jump up and drive back to... what even could you do if you knew?

yetihehe
0 replies
8h38m

I call bullshit. They've been conditioned to think that they want it, because all product brochures have it.

You gave the reason WHY they want it. Maybe consumers were conditioned to want access, but they still want access. If you give them similar devices, they will choose the one which has an application or webpage to see how their big investment is actually working. It's not about the current state of the device, it's all about historical data and month-by-month savings presented as a nice graph. They will check this maybe every week or month (later every several months), but buyers still want to know what their installation did for them.

pjc50
0 replies
7h45m

OK, so key question: why is there a control plane in there at all?

I can understand people wanting to be able to see the metering live, but remote control of the panels just seems like a security incident waiting to happen. I'm quite glad I have a non-internet-connected inverter.

oezi
0 replies
23h42m

The key failure is that despite the IPv6 transition we don't have static IPs at home and can't simply host servers at home.

Certainly securing the IoT space requires a lot of progress, but we can't allow the enshittification of clouds to continue.

danielovichdk
0 replies
1d1h

These plants or farms are usually built around and on top of industrial IEC protocols and SCADA controllers, which is a lot more low-level than what any cloud IoT provider offers.

I have done a controller for a 40-foot container battery and it wasn't like we received any API from Hitachi (the battery manufacturer). We had to write everything ourselves.

crabmusket
0 replies
12h58m

I agree, and I wish it were otherwise. Why is it so difficult for me to have a home network where things can just work? Why is it a mess of configuration and self signed certificates? It seems like nobody is incentivised to provide this, because nobody providing me with devices, services, and so on lives in my house with me. They need my data and my control pathways to go via them, not to stay in my house.

bdamm
0 replies
1d2h

For IoT stuff in general; I can do it, and I don't want to because I'd rather spend my time doing other things (although yeah, I totally did learn everything I could about my solar array, because it is a source of power, after all. But for the other stuff...)

dataflow
12 replies
1d2h

It’s also possible that the manufacturer gets hacked, and subsequently sends out attacker controlled and wrong software updates to the inverters, with possibly dire consequences.

There are also people that claim that the many Chinese companies managing our power panels for us might intentionally want to harm us. Who knows.

Wait, seriously? The European power system relies on Chinese companies not messing it up remotely? And the debate is over whether the companies will stay nice? For heaven's sake, isn't it obvious that during a war the Chinese government can force them to just destroy the continent's power system remotely? How is this not seen as an extreme continental security risk?

amelius
6 replies
1d1h

They already can, by simply switching open some power MOSFETs in their fleet of EVs.

bilbo0s
5 replies
1d1h

Yeah.

I'm not sure everyone is really thinking clearly here.

Don't get me wrong, they should get rid of this practice of cloud monitoring. A consumer should be able to access monitoring over the internet without an intermediary. They should, of course, be allowed to contract with an intermediary if that is their desire.

But the security argument?

Yeah, that ship has sailed. Total war, means total war. Your power grid, your internet, your communications, and your fossil fuel deliveries will all see material disruption. I wouldn't count on being able to stop those disruptions by banning a few web sites. (And frankly, during total war, those disruptions would be the least of your problems in any case.)

Best bet for places like Europe, China, the US and Russia is, just don't do total war with each other. If you choose to do it anyway, then you can see what you can expect from that in the documents filed under "Play stupid games, win stupid prizes."

crazygringo
4 replies
1d

You're turning war into a black-and-white "total war" situation. Total war is rare, and no -- no ships have sailed.

It's easy to imagine a scenario where something happens between China and Taiwan, Europe gets involved in a way that majorly pisses off China, and China decides to sabotage Europe's grid in response.

Nothing about that is "total war" with Europe, and it's not like Europe is going to escalate with nukes either because that would be wildly disproportionate.

But it's a major vulnerability that should be fixed as quickly as possible. It's negligent for that to even be an option for China, because it certainly doesn't seem like Europe can do anything similar to the grid in China.

Your idea that security vulnerabilities don't matter, that "that ship has sailed", is false and irresponsible.

bilbo0s
3 replies
22h29m

You've totally missed the point.

No one advocated ignoring the vulnerability. I, myself, specifically stated that monitoring should be direct. Consumers should unilaterally decide where, when and how their assets are monitored.

The material point on security is that there are many, many methods of disrupting a power grid. Even when you are looking for plausible deniability, shutting down solar panels from a cloud website doesn't make the list of your top 10 options. (In fact, it won't make the list in those scenarios precisely because you are looking for plausible deniability.)

Let's imagine a power grid as modern societies know them today, except all consumers monitor their solar panels themselves, and none of those consumers outsource this monitoring function to any third party foreign or domestic. Power grids can still be materially disrupted in this scenario. Especially in the case of total war. Obviously in the case of open war. And particularly in the case of cold war.

As I said, I advocate consumers disconnecting any power generation functions from networks. But if I'm in the seat coming up with post-conflict, or even simply emergency-recovery, operating assumptions, I'm not counting on those panels generating power. It's just irresponsible to do so. In total war, EMP will knock most of that generation offline where you're lucky enough not to have it eliminated entirely. In cold or open war, disruptions to distribution can and will render that generation useless. (Just ask Ukraine.)

Consumer cloud, or even personal, monitoring of solar panels does not enhance, nor does it degrade, your adversary's ability to disrupt your power grid when your adversary is at that super power level. If you believe it does, you're either not looking at the full spectrum of what you're calling "vulnerabilities" extant in the infrastructure of modern societies. Or you're underestimating the full spectrum of capabilities of modern military powers. Both, frankly, are fatal mistakes in the types of crises we're postulating.

crazygringo
2 replies
21h35m

No, your point was clearly stated:

But the security argument? Yeah, that ship has sailed. Total war, means total war.

Those are your words.

I'm saying, focusing on total war is irresponsible and leads you to draw false conclusions. In the real world, limited conflicts are what we're dealing with 99.9+% of the time, thank goodness.

And now in your new comment, for some reason you're focusing on "plausible deniability", which is another red herring. If China wants to disrupt Europe's grid, it doesn't care about plausible deniability -- the entire point is to publicly retaliate. It just needs to do it, as easily as possible. The idea that relying on a cloud vulnerability "doesn't make a list of your top 10 options" doesn't make any sense at all. It might very well be the #1 option, or one of three tactics employed simultaneously.

bilbo0s
1 replies
17h47m

The security argument against cloud based monitoring has sailed.

With or without cloud based monitoring, our power grids can be disrupted.

That's the commonly accepted meaning of "that ship has sailed" as a colloquialism with respect to cloud based monitoring.

Also, you, yourself, brought up the idea of cold war style confrontation. The basis of most actions against proxy supporters in cold war style conflicts is plausible deniability. It's not a red herring, it's a widely adhered to tenet of cold war style conflict planning when targeting said proxy supporters.

I tried to cover total war, open war, and cold war to address the full spectrum of likely super power on super power active confrontations. In each scenario, the existence, or non-existence, of cloud based monitoring of solar panels, has no effect on the ability or inability of your adversary to disrupt your power grid.

Which disruption was the central thesis of your assertion. I was simply explaining why it was false.

You are being willfully argumentative at this point. If you didn't want to address cold war scenarios, why did you bring them up? You have a nice day sir or ma'am.

WA
0 replies
11h3m

How could a super power disrupt the energy grid in a non-total war scenario?

formerly_proven
1 replies
1d1h

Same continent that bought energy for decades from its strategic enemy. Coincidence? Probably not. Boundless naivete and corruption? Also yes.

ragebol
0 replies
1d1h

Russia wasn't an enemy for a while. The belief was that engaging with them would ensure they wouldn't be an enemy again. That failed.

Germany was an enemy once as well

shagie
0 replies
1d1h

It’s also possible that the manufacturer gets hacked, and subsequently sends out attacker controlled and wrong software updates to the inverters, with possibly dire consequences.

Idaho National Lab is one of those places that researches this. https://inl.gov - their domains are energy (primarily nuclear and integrated) and national security ... and securing the grid is the intersection of that.

And some time back... https://www.wired.com/story/how-30-lines-of-code-blew-up-27-... ( https://web.archive.org/web/20201101002448/https://www.wired... ) . The story is from 2020. The event is from 2007.

The test footage linked in the article is on YouTube - https://youtu.be/LM8kLaJ2NDU

The wikipedia article on the test: https://en.wikipedia.org/wiki/Aurora_Generator_Test

From the wired article the key part of how it broke:

A protective relay attached to that generator was designed to prevent it from connecting to the rest of the power system without first syncing to that exact rhythm: 60 hertz. But Assante’s hacker in Idaho Falls had just reprogrammed that safeguard device, flipping its logic on its head.

At 11:33 am and 23 seconds, the protective relay observed that the generator was perfectly synced. But then its corrupted brain did the opposite of what it was meant to do: It opened a circuit breaker to disconnect the machine.

When the generator was detached from the larger circuit of Idaho National Laboratory’s electrical grid and relieved of the burden of sharing its energy with that vast system, it instantly began to accelerate, spinning faster, like a pack of horses that had been let loose from its carriage. As soon as the protective relay observed that the generator’s rotation had sped up to be fully out of sync with the rest of the grid, its maliciously flipped logic immediately reconnected it to the grid’s machinery.
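
As a toy illustration (a hypothetical simplification, not the actual relay firmware), the inverted protection logic from the Aurora test amounts to flipping one predicate:

```python
def relay_action(synced: bool, compromised: bool) -> str:
    """Decide the breaker action for a generator protective relay.

    A healthy relay closes the breaker only when the generator is synced
    to the grid's 60 Hz rhythm; the compromised relay in the Aurora test
    did the exact opposite.
    """
    if not compromised:
        return "close" if synced else "open"
    # Maliciously inverted logic: disconnect the synced generator, then
    # slam the now-accelerating machine back onto the grid once it has
    # drifted out of sync -- the torque mismatch is what destroys it.
    return "open" if synced else "close"

# Healthy relay: connect only when in sync.
assert relay_action(synced=True, compromised=False) == "close"
# Compromised relay: disconnect the synced generator...
assert relay_action(synced=True, compromised=True) == "open"
# ...and reconnect it once it has spun out of sync.
assert relay_action(synced=False, compromised=True) == "close"
```

The point of the test was that this one flipped predicate, applied at the right moments, is enough to physically destroy the machine.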

lifestyleguru
0 replies
1d1h

Another security issue is all these cheap, always-connected IP cameras from China. Meanwhile, the most recent achievement of EU lawmakers is the cap permanently attached to a bottle. No wonder; at least in the case of my country, we are sending the most corrupt, sleazy individuals to the EU parliament and commission.

afh1
0 replies
1d1h

Shutting off nuclear to rely on gas from Russia was not seen as an extreme continental security risk. This is nothing...

dathos
9 replies
1d2h

I live off-grid, power- and water-wise, and it really irked me that the monitoring that comes with my inverter is only available online. Even when there is a network available, the app will not work. I fixed this by getting a Raspberry Pi connected and reading it from there, but if I disconnect the inverter from the internet it will create a new network, so now there is always an open network in the middle of nowhere with no option to disable it.

I'm thinking about screwing it open and desoldering the wifi module but honestly I'll replace it in the next couple of years so I'd rather not kill myself by making a mistake.

m463
3 replies
1d1h

why can't people just make stuff and sell it?

Spivak
1 replies
1d

Because humans are an ongoing cost and no one has figured out how to sell non-consumable, slowly depreciating goods as one-off purchases and keep paying your employees once you saturate your market.

Option 1: Artificially sell the thing as an ongoing cost.

Option 2: Artificially make the depreciation cycle faster. Get consumers to regularly replace it anyway with upgrades or trend changes.

Option 3: Make ongoing money from the item via a side-channel (tvs are great at this one)

Option 4: Manufacture and sell a huge number of different goods across market segments and weather the slow depreciation cycle (Oxo does this).

Option 5: Sell some consumable good you can get recurring revenue from along side the item (Coffee pods, printer ink)

Option 6: Make up the money on maintenance, repairs, and financing. Become a bank.

Option 7: Make your money in some other sustainable profitable business and drop the product once you've gotten what you can for it.

All of these kinda suck and option 1 is easy to implement.

callmeal
0 replies
9h59m

keep paying your employees once you saturate your market.

employees? they get fired once the market is saturated and demand flattens. ITYM shareholders.

When you look at it from that angle, another possible solution is:

- downsize manufacturing plant/capacity to make it match (or slightly exceed) replacement demand.

But that would mean a steady state profit margin (and not a cancerously growing one), so it will never fly.

lotsofpulp
0 replies
1d1h

In a developed country, there are lots of regulations and liabilities you are exposed to once you start selling something.

ansible
3 replies
1d1h

The high-voltage side should be separated from the electronics, so it shouldn't be dangerous if you are observant.

It may be sufficient to just disconnect the antennas from the WiFi module, that will help prevent any network connections.

Nextgrid
1 replies
1d1h

Disconnecting the antenna would still leave some RF leakage at close range. Grounding the antenna might be a better option. But in practice, the dangers highlighted by the article only surface when an attacker has control of many solar plants at scale.

Compromising an individual one by getting close-range physical access will be a local annoyance but wouldn't scale to a level where it can threaten the grid, so it limits the pool of potential attackers to local vandals (which can achieve their goals easier by just throwing rocks at your panels).

ijustlovemath
0 replies
20h24m

Without an antenna, even at close range, initial handshakes will fail or be unreliable.

serial_dev
0 replies
23h6m

Disclaimer, YMMV: if you have no clue about these systems (average people), you can still easily kill yourself in the process.

grecy
0 replies
1d1h

What inverter do you have? Many like the Fronius have a removable networking card.

ano-ther
9 replies
1d1h

Eye opening for me. One of the arguments for renewable energy (besides emissions) has always been its potential for decentralizing power generation. Makes it more resilient, democratizes the means of production etc.

This article shows that we inadvertently introduced new choke points. And of course the global security environment makes it more worrisome.

panki27
4 replies
1d1h

Hmm, almost like what happened to the internet... the idea being "everything is decentralized", but now 80%+ of traffic passes through Cloudflare and over 90% of mail comes from 2 providers!

paxys
3 replies
1d1h

Cloudflare absolutely does not control 80% of internet traffic. I have no idea where you got that number from.

paxys
1 replies
23h59m

That's still the number of websites, not their traffic. A personal blog hosted on Cloudflare and google.com are not the same.

Aachen
0 replies
3h53m

Probably that just skews it worse, though. Most people aren't CF customers, but many of the big targets seek protection.

realusername
0 replies
1d1h

It never made any sense anyway; nothing can really escape economies of scale, whatever technology is being used.

lupire
0 replies
1d1h

Solar is not the same as renewable.

Renewable and decentralized are different axes.

kkfx
0 replies
22h41m

Yes, p.v. has opened the way for semi-autonomy, depending on where you live, BUT the ruling class really dislikes this: they want slaves, not Citizens, and tying people to a service is a very good way of making slaves who can't revolt.

That's why instead of pushing self-consumption and semi-autonomous systems we push grid-tied and cloud-tied crap: to be tied to someone else's service, a slave to it. It's the "in 2030 you'll own nothing", already a reality in modern cars, which are connected to the OEM with much higher access than the formal owner has, and in much modern IoT and cloud+mobile crap. People do not even understand they do not own, until it's too late.

Another simple example: in most of the world, banks have open standards to automatically exchange transactions between themselves; in the EU that's the Open Banking APIs, with signed XML and JSON feeds. There is NO REASON to block customers from directly using such APIs from a personal desktop client. All banks I know block such usage. So you do not have all your transactions, signed by the bank, on your own iron; you have NOTHING in hand. In case of "serious issues" you have nothing to prove what you have in your bank account or what you have done with your money. In the past we had paper to prove it; we now have signed XML/JSON, which is even better than paper, being much harder to falsify. But no, we miss out, because the 99% must own nothing.
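
As a toy illustration of the value of a signed feed (HMAC with a shared key here, for brevity; real Open Banking feeds use asymmetric signatures such as JWS, so every name below is hypothetical):

```python
import hmac
import hashlib
import json

KEY = b"bank-issued-shared-secret"   # hypothetical; real systems use PKI

def sign_transaction(tx: dict) -> dict:
    """Bank side: attach a signature over the canonical JSON encoding."""
    payload = json.dumps(tx, sort_keys=True).encode()
    signed = dict(tx)
    signed["sig"] = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return signed

def verify_transaction(tx: dict) -> bool:
    """Customer side: recompute the signature from the stored record.
    A tampered amount or date no longer matches; unlike paper, the
    forgery is detectable by anyone holding the key."""
    record = dict(tx)
    sig = record.pop("sig")
    payload = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

record = sign_transaction(
    {"date": "2024-06-01", "amount": -42.5, "to": "NL00BANK0123456789"})
print(verify_transaction(record))   # True
record["amount"] = -4250.0          # tampering is detected
print(verify_transaction(record))   # False
```

Holding such signed records on your own machine is what the comment means by having something "in hand" that is harder to falsify than paper.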

We have connected cars with a SIM inside, but instead of the car offering APIs and a client, or perhaps even a WebUI, directly to its formal owner, we have to pass through the OEM, the real substantial owner. And we can't even disconnect the car. In the EU it's even illegal for a new car to be disconnected, since the emergency eCall service must be active on all new cars.

And so on.

foolfoolz
0 replies
13h56m

decentralized solar will never be able to provide power at scale. even the scale of 1 household. only homes with lots of land could afford the amount of panels needed. the average home will always need to consume power generated offsite

mikewarot
7 replies
23h0m

It irks me endlessly that we live in the worst timeline, where the computer equivalent of fuses and circuit breakers are almost completely unknown. Instead we trust code blindly.

This results in almost all of the situations threads here address.

In a better timeline, everyone has stable and secure OSs on all their devices, and the default is for everything to be locally networked, with optional monitoring from the outside via a data diode.

DoctorOetker
6 replies
22h40m

it's incredibly hard to implement a data diode for PV systems; enemy satellites can modulate light (like a TV remote, but at a lower baud rate to stay below the noise floor) and an inverter could decode it and respond accordingly.

They measure the PV panels anyway for MPPT.
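
For context on why that measurement channel always exists: MPPT controllers continuously perturb the panel's operating voltage and watch whether extracted power rises. A minimal perturb-and-observe loop (the panel curve and all numbers are invented for illustration):

```python
def panel_power(v: float) -> float:
    """Toy PV power curve peaking at 30 V; not a real panel model."""
    return max(0.0, 200.0 - 0.5 * (v - 30.0) ** 2)

def mppt_perturb_observe(v: float, steps: int = 200, dv: float = 0.1) -> float:
    """Classic perturb-and-observe MPPT: step the voltage, keep stepping
    in the same direction while power rises, reverse when it falls."""
    direction = 1.0
    p_prev = panel_power(v)
    for _ in range(steps):
        v += direction * dv
        p = panel_power(v)
        if p < p_prev:          # power fell: we stepped past the peak
            direction = -direction
        p_prev = p
    return v

v_final = mppt_perturb_observe(20.0)
# The operating point converges to, and oscillates around, the 30 V peak.
print(round(v_final, 1))
```

Because this loop runs constantly, the inverter is by design always sampling the light falling on the panels, which is the sensing channel the parent comment is worried about.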

Dylan16807
3 replies
14h20m

You're describing two very different concepts at the start.

A data diode applies to a specific connection. It's easy to have a serial port that goes one way.

Preventing any possible input to an already compromised device is much harder. But if your device isn't already compromised then it won't be looking at the input light levels for commands.

DoctorOetker
2 replies
10h39m

But if your device isn't already compromised then it won't be looking at the input light levels for commands.

But this is precisely part of the threat model, the manufacturers are best positioned to execute supply chain attacks on foreign buyers.

Dylan16807
1 replies
7h20m

That was not part of the threat model in the section of the post that mentioned data diodes.

And it's vastly harder to do an attack like that.

DoctorOetker
0 replies
6h21m

It's quite trivial really: what is needed to capture a weak signal with known modulation out of the background is integration time. Think of how deep-space light can be captured by digital cameras with "long" exposure times; they reveal light the human eye can't see, because the integration time of light on the retina is too short.

Now, other light sources will also integrate with time; this is where the modulation scheme comes in. First consider the amount of time you'd have to integrate the noisy signal to raise it above the noise floor. That's the on-time you need. How do we remove background light variations from other sources? Consider a discrete-time, pre-agreed pseudorandom sequence that has "0" periods as often as "1" periods. To remove a constant background, you calculate the sum of measured light intensities over all "1" periods and the sum over all "0" periods. Then you subtract the "0"-sum from the "1"-sum: a constant signal removes itself, while the satellite signal is summed N times. Since your pseudorandom sequence was kept secret, random variations in light (think of a bird passing by) will not conspire to selectively block light during the "1" periods, so such noise will be uncorrelated with your pseudorandom signal. Adding N uncorrelated noises grows by sqrt(N), so the S/N ratio grows as sqrt(N). These are widely understood methods; an engineer might call it lock-in amplification, a physicist might call it correlation. This is very basic engineering/science knowledge. It's baffling that people consider this "hard" to execute; sure, if you're the milkman in a village, this is hard to execute.
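
The integration argument can be demonstrated numerically. This toy sketch (all amplitudes and chip counts invented) recovers bits whose per-sample amplitude is ten times smaller than the noise, by correlating against the shared pseudorandom sequence:

```python
import random

random.seed(42)
N = 20_000                                        # chip periods per bit
chips = [random.randint(0, 1) for _ in range(N)]  # shared secret PN sequence

def received(bit: int) -> list[float]:
    """Per-chip brightness samples: a large constant background, Gaussian
    noise ten times bigger than the signal, plus a +-0.5 amplitude keyed
    onto the secret chip sequence (antipodal signalling)."""
    amp = 0.5 if bit else -0.5
    return [100.0 + random.gauss(0.0, 5.0) + (amp if c else 0.0)
            for c in chips]

def demodulate(samples: list[float]) -> int:
    """Lock-in style correlation: mean of '1'-chip samples minus mean of
    '0'-chip samples. The constant background cancels, and uncorrelated
    noise shrinks as 1/sqrt(N), so the buried bit emerges."""
    ones = [s for s, c in zip(samples, chips) if c]
    zeros = [s for s, c in zip(samples, chips) if not c]
    diff = sum(ones) / len(ones) - sum(zeros) / len(zeros)
    return 1 if diff > 0.0 else 0

bits = [demodulate(received(b)) for b in (1, 0, 1, 1, 0)]
print(bits)  # the hidden message emerges despite per-sample SNR of 0.1
```

With these numbers the decision margin after integration is around seven standard deviations, so the bits decode reliably even though each individual sample looks like pure noise.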

mikewarot
1 replies
6h23m

Sunlight is about 1000 watts/meter^2 on the ground. Do you have any idea how much power you'd have to send to get through a Jewish Space Laser(tm) to even have a -40 dB s/(s+n) signal to pull out of the noise through a side-channel attack?

It would be interesting to use a solar panel as a sensor for moonbounce of an IR laser, though. You'd have to try at the almost new moon, with a window of a few hours/month ;-)

DoctorOetker
0 replies
4h34m

You send the instructions at night.

neilv
6 replies
1d

0.002 MW - Small set of technical standards, no diplomas or certificates required

Be careful with this language, especially when you're involving politicians and the non-technical.

The current atrocity of criminally negligent IT infrastructure right now is mostly created and driven by people with diplomas, including from the most prestigious schools. (And a top HN story over the weekend was one of the most famous tech company execs, turned government advisor, advising students at Stanford to behave unethically, and then get enough money to pay lawyers to make the consequences go away.)

And most of the certificates we do have are individual certifications that are largely nonsense vendor training and lock-in, and these same people are then assembling and operating systems from the criminally negligent vendors. And our IT practices certifications are largely inadequate compliance theatre, to let people off the hook for actual sufficient competence.

My best guess for how to start to fix this is to hold companies accountable. For example, CrowdStrike (not the worst offender, but recent example): treat it as negligence, hold them liable for all costs, which I'd guess might destroy the stock, and make C-suite and upper parts of the org chart fear prison time as a very serious investigation proceeds. I'd guess seeing that the game has changed would start to align investors and executives at other companies. What could follow next (with growing pains) is a big shakeup of the rest of the org chart and practices -- as companies figure out that they have to kill off all the culture of job-hopping, resume-driven-development, Leetcode fratbro culture, IT vendor shop fiefdoms, etc. I'd guess some companies will be wiped out as they flail around, since they'll still have too many people wired to play the old game, who will see no career option other than to try to fake it till they make it at the new, responsible game (ironically, and self-defeatingly, taking the company down with them).

WalterBright
2 replies
1d

Punishment is not the answer; you'll just drive lots of competent people out of the industry. Punishment also means that nobody will admit to mistakes, mistakes will not get fixed (because that implies guilt), and mistakes will get covered up.

Punishment for mistakes is what led to the Chernobyl disaster.

neilv
1 replies
1d

Flight safety works so well because the personnel are aligned with safety and professionalism, and the FAA has an important program in place to protect people from being punished for behaving professionally. And IIRC you're familiar with aircraft manufacturer alignment with safety.

But I'm concerned about the entire field of software, which doesn't have that sense of responsibility, and I don't see how it would get it. However, the software industry -- both companies and workers -- is guided almost entirely by money. To the point that it's often hard to explain to many people in HN discussions why it would be good to behave in any other way than complete mercenary self-interest. So I don't see any way to get alignment other than to link money to it. If people see that as punishment, so be it.

WalterBright
0 replies
20h39m

Every business is guided almost entirely by money. The purpose of a business is to make money.

Organizations based on other incentives don't work, or work very poorly.

hinkley
1 replies
1d

Put another way: it’s far too easy and common for certification to encourage rote memorization. And only rote memorization. No higher order reasoning is imparted.

Knowledge without reasoning is how you get mired in bureaucracy.

neilv
0 replies
1d

I think the larger problem is alignment.

BS gatekeeping rituals and compliance-for-sale theatre are arguably just symptoms -- of companies and individuals not being aligned with developing trustworthy systems.

pas
0 replies
22h10m

in your later comment you mention alignment, but the reason is that there's an enormous market discontinuity between doing the "super-duper right thing" and doing the profitable thing ... due to network effect(s).

we see competition in cloud/IaaS providers because they actually need to build datacenters and networks and so there's some price floor, but when it comes to "antivirus" CrowdStrike was able to corner the market basically, and downstream from them not a lot of organizations/clients/customers can justify having actual independent hot-spare backups (or having special procedures for updating CS signatures by only allowing it to phone home on a test env first)

the cultural symptoms you describe in so much detail are basically the froth (the economic inefficiencies afforded) on top of all the actual economic activity that's sloshing around various cost-benefit optimum points.

and it's very hard to move away from this, because in general IT is standardized enough that any business that needs some kind of IT-as-a-service will be basically forced to pick based on cost, and will basically pick whatever others in their sector pick -- and even if there are multiple providers they will usually converge on the same technology (because it's software) -- thus this minimizes the financial risk for clients/customers/downstream, even if the actual global/systemic risk increases.

WalterBright
6 replies
1d

It’s also possible to install new software (firmware) on the inverters via the manufacturer, either automatically or manually.

As always, the vulnerability of enabling remote updates. When will people learn? Updates should only be possible if there's a physical switch (not a software switch) on the device. If it's "off", no updates are possible.

Isn't the most devastating attack vector remotely installing malware? With a hardware switch, none of that malware will survive a reboot of the device.

I remember when hard disk drives came with a write-enable jumper. Then, once you've made a backup, the jumper is removed. Then it is impossible to accidentally or maliciously write over your precious backup.

DoctorOetker
5 replies
22h33m

That doesn't protect against supply chain attacks.

WalterBright
4 replies
20h58m

Neither does remote updating. But you'll still need physical access to the supply chain to compromise it, and that's not possible for some hacker in a basement.

DoctorOetker
2 replies
20h46m

I never claimed remote updating would prevent supply chain attacks.

I was responding to:

With a hardware switch, none of that malware will survive a reboot of the device.

A reboot of the inverter would not prevent a supply chain attack using MPPT measurement electronics for an optical backdoor channel.

WalterBright
1 replies
20h44m

So don't put the backdoor channel in without a physical switch.

DoctorOetker
0 replies
20h18m

Attackers don't ask permission.

The hardware backdoor channel is present anyway because MPPT needs it.

The software can abuse the measurements to listen for optically transmitted commands.

DoctorOetker
0 replies
20h4m

But you'll still need physical access to the supply chain to compromise it, and that's not possible for some hacker in a basement.

I forgot to respond to this sentence in the sibling response.

Supply chain attacks can be executed by intermediaries in the supply chain, or by manufacturers themselves: develop the capability to deny a foreign nation its energy infrastructure. The manufacturer is not a hacker in a basement. Manufacturers can be pressured by their local governments, militaries, 3-letter agencies, ...

A precautionary principle would induce potential target nations to surreptitiously catalogue the inverter boards, sort them by type (most GW served first), and work out which control traces to cut so that the internal energy transfers in their inductors, capacitors, ... can be driven from a trusted parasite board. Just develop and test a few parasite boards for the most common inverters, and preferably have critical stock ready.

The main value in inverters is the power switches, inductors, capacitors, ...; it would be cheaper to reroute the control to a trusted controller in the event of a calamity. We would survive fine, but it would be a painful few days.

WaitWaitWha
6 replies
1d1h

Q: Are there no regulatory requirements for power plants of any kind in the EU, especially around cybersecurity?

I do not allow any system into my environments (at home and at work) that requires a third party data connection function.

There are way too many incidents where a provider, cloud or otherwise which required connection failed for various reasons.

(e.g., Cisco Spark Board, Xerox ConnectKey, Google Cloud Print, WeWork's connected devices, Lattice Engines, MS Groove Music Pass, Shyp, Adobe Business Catalyst, Samsara, Zune, FuelBand, Anki Vector Robot, Google Stadia, Pebble)

Despite this, I am very leery of regulating solar power specifically.

numpad0
2 replies
1d

How would one practically verify and certify the cybersecurity of a product? Even payment smartcards sometimes come with non-malicious maintenance backdoors. There seems to be little to no academic/theoretical basis to this whole software security thing.

g_p
1 replies
23h21m

Given the challenges of techniques like TLS interception (i.e. through pinning and other good security features), about the only measure I can see left is network isolation.

You can set up a local network that has no WAN connectivity on it. For almost anything else, it's difficult to verify even the most basic security properties. Certifying is another step up (although you could argue certifying is just a third party saying something passed a finite list of tests) - the real challenge is defining a meaningful certification scheme.

There has been some good work towards consumer IoT device security (i.e. the 13 steps approach from the UK), that covers some of the lowest hanging fruit - https://www.gov.uk/government/publications/code-of-practice-...

The trouble is that these set out principles, but it's hard to validate those principles without having about the same amount of knowledge as required to build an equivalent system in the first place.

If you at least know the system is not connected to a WAN, you can limit the assurance required (look for WiFi functionality, new SSIDs, and attempts to connect to open networks), but at a certain point you need to be able to trust the vendor (else they could put a hard-coded "time bomb" into the code for the solutions they develop).

I don't see much value in the academic/theoretical approaches to verification (for a consumer or stakeholder concerned by issues like these), as they tend to operate on an unrealistic set of assumptions (i.e. source code or similar levels of unrealistic access) - the reality is it could take a few days for a good embedded device hacker to even get binary firmware extracted from a device, and source code is likely a dream for products built to the lowest price overseas and imported.

BlueTemplar
0 replies
10h28m

We're not just talking about random consumer hardware : with security issues like these, I don't see why closed source software would not be just banned.

afh1
2 replies
1d1h

Smartphones don't count?

WaitWaitWha
1 replies
1d1h

Apologies, but do not understand the question.

Are you suggesting smartphones should count as "not allowing it in"? Then yes, I try to where possible. I do not depend on a smartphone. All functionality that is operationally necessary can be done elsewhere without major delays or impact.

afh1
0 replies
20h50m

Interesting. How do you handle MFA, do you have a special device for that? Your bank/brokerage don't require their app?

Derbasti
6 replies
23h25m

There's a reason why I took my inverter offline after making sure that it was installed correctly. A cheap power meter now serves to measure my power generation instead.

DoctorOetker
5 replies
22h31m

Taking it offline doesn't protect against supply chain attacks in the form of built-in kill switches. A satellite could transmit signed instructions by modulating light below the noise floor; inverters must sense the voltage/current state of the PV panels anyway for MPPT to work.

Only deep inspection of the silicon and code can improve the situation.

Perhaps Western blocks could develop provably secure silicon IP and code, formally verified, and perform continuous random sampling on imported goods, including full multilayer silicon inspection; publish it for free and refuse to import products that don't cooperate.

BenjiWiebe
4 replies
21h36m

I'm curious about the feasibility of modulating light onto a solar panel. I feel it would not be feasible, except possibly onto a single panel at a time over a long time period. Just a gut feeling based off radio stuff (GPS).

DoctorOetker
3 replies
21h4m

GPS can provide the coherent reference. If you mean transmitting a signal (say, sound) while the panel is illuminated by the sun: there are YouTube videos of people doing that with a laser pointer, even in sunlight and without an information-theoretically justified modulation scheme.

Nothing prevents the satellite from transmitting the commands at night, if that feels more convincing to you.

Ask yourself what is the active area of a photodiode in your TV/... ? What is the active area of your light-bucket on a roof?

outsomnia
2 replies
12h11m

This is just FUD.

carapace
0 replies
4h9m

I thought so too at first, but no, this person is not wrong. The situation is FUBAR.

DoctorOetker
0 replies
10h41m

I'm not at all claiming this is happening, I'm claiming this is very feasible to do. Consider the aperture (area) of a camera or telescope, how many times can you fit this area into a domestic solar panel installation?

If we are discussing the solar panel that powers your outdoor garden light, I don't suggest to apply the precautionary principle, but we are talking about products that sum up to a significant fraction of grid power generation.

dpedu
5 replies
22h0m

I can make my computer wildly vary the amount of power it is drawing by performing different things in software. Max out the CPU and GPU load and it will instantly change from drawing ~100 watts to 500 or more.

There have been plenty of botnets in the past. Some even in the millions of computers. If such a botnet decided to make every node's power draw fluctuate per above, wouldn't this cause the same type of problem? Is there a reason we've never seen this happen despite large enough networks of hacked machines existing?

berkes
4 replies
21h48m

The Netherlands (which the article is mainly about) has 8.4 million households; let's presume they own an average of one such PC as you mention. A delta of 400 W would mean a total consumption delta of 3.36 GW. That's "peanuts" to cover.

And that presumes an attacker can switch all 8.4 million computers on/off in a small timeframe. 100% of them would need to be on, online, and hacked.

I don't think this is a realistic problem.

Tesla F-ing up an OTA update that suddenly switches all charging Teslas off is probably a worse theoretical scenario.

dpedu
2 replies
21h3m

I don't doubt that that many watts is easy to cover - eventually. The problem is that it can be instantly turned on and off, whereas the grid takes time to shed load or add capacity.

I found a figure on Wikipedia saying that the NL's 4.7GW worth of offshore wind capacity is 16% of their total electricity demand nationwide. 4.7/.16 = 30GW total, so this theorized computer load attack would represent about 10% of their grid's total capacity. Can their grid add and shed that much load that quickly? That's the part I doubt.
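
The back-of-the-envelope arithmetic in this exchange is easy to check (using the figures as quoted above):

```python
# Worst-case demand swing from the hypothetical PC botnet.
households = 8_400_000   # Dutch households, as quoted above
delta_w = 400            # per-PC swing between idle and full load, in watts
swing_gw = households * delta_w / 1e9
print(f"demand swing: {swing_gw:.2f} GW")          # 3.36 GW

# Implied total grid capacity from the offshore-wind share.
offshore_wind_gw = 4.7   # quoted offshore wind capacity
share = 0.16             # its quoted share of national demand
total_gw = offshore_wind_gw / share
print(f"implied total: {total_gw:.0f} GW")         # 29 GW
print(f"attack share: {swing_gw / total_gw:.0%}")  # 11%
```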

berkes
1 replies
8h29m

You skipped over the part where I point out that my assumptions are completely off. These numbers presume that all computers in all Dutch households are hacked, running and connected to the internet. 5% of that would be on the high side even.

So a more realistic "attack" would be able to move demand by 0.5% of total grid capacity. Switching one smelter in an aluminium factory on/off is probably more than that. Hacking a major charging-station company and switching off their chargers is probably even more than that.

I understand the direction you think, and I agree that the combined power usage of "consumer devices" is big. But the larger power system is rather well protected by an attack on these devices through the diversity of these devices and the diversity of their setup (consumer firewalls, routers, individual protection, in-house fuses, local load killswitches etc).

The solar devices lack this diversity, as the article mentions. There are few brands, and all devices of a brand need to connect to the one cloud service in the exact same way. So this does have a single point of attack, whereas "switching on/off all personal computers in a country" is of an entirely different level.

dpedu
0 replies
37m

According to World Population Review and corroborated by other sources, the Netherlands has 91 PCs per 100 people. So your assumptions aren't that far off, actually. If anything, your assumption is lower than reality because a household is defined as 2 or more people.

https://worldpopulationreview.com/country-rankings/computers...

anentropic
0 replies
7h27m

Tesla F-ing up an OTA update that suddenly switches all charging Tesla's off, is probably a theoretical worse scenario.

and sounds like something that could easily occur

SnorkelTan
5 replies
1d

If solar panels can be turned off, why are utility companies having to sell excess power at a loss? Why can’t they tell the solar farms to reduce their output by the required amount?

trebligdivad
0 replies
1d

In theory, someone somewhere should be incentivised to spend money on building storage systems so that they then have to pay less money in the future on excess days.

sanderjd
0 replies
1d

Solar power does get curtailed pretty often, but there isn't one uniform solution to the problem, different utilities / markets / grids have chosen different solutions to this.

pas
0 replies
21h28m

as far as I understand there's a market-based solution. producers bid prices for time slots (consumers too, but that's less important from the perspective of a solar power plant) and if they win, the contract is live: they need to deliver for that slot. if they miss (go over or under) they get paid less (and of course a penalty is possible too; theoretically it's the same thing)

this incentivizes better capacity and availability forecasting for solar installations, and preserves the usual dynamics of the open energy market.
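A toy version of that settlement logic, with the prices and the penalty scheme all assumed for illustration:

```python
# Toy imbalance settlement for one delivery slot (illustrative scheme:
# pay the contract price on delivered energy up to the contracted amount,
# charge an imbalance price on any shortfall or surplus).
def settle_slot(contracted_mwh, delivered_mwh, price, imbalance_price):
    paid = min(delivered_mwh, contracted_mwh) * price
    deviation = abs(delivered_mwh - contracted_mwh)
    return paid - deviation * imbalance_price

# Delivered 8 MWh against a 10 MWh contract at 60 EUR/MWh,
# with a 90 EUR/MWh imbalance price: 8*60 - 2*90 = 300 EUR.
print(settle_slot(10, 8, 60, 90))
```

Under a scheme like this, a producer who forecasts well keeps the full contract revenue, which is the incentive the comment describes.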

..

the problem is with these super small ones, where initially states just let people connect them, because it's green, yay. (but now DSOs have started making connecting way harder, and regulators are investigating, e.g. in Spain. [0])

of course, non-residential installations already usually need aFRR capability. (e.g. this is the case in Hungary.)

and there's already a market for "reserves" in the EU. (but the interconnection rate is below the 15% target as far as I know. still, there are intra-state markets, etc.) and we can see that when solar is high, reserve prices surge. [1]

[0] https://caneurope.org/content/uploads/2024/04/Rooftop-Solar-...

[1] https://gemenergyanalytics.substack.com/p/european-power-res...

kkfx
0 replies
1d

It's worse, actually, at least in France: if you inject into the grid you have to pay an "energy transport fee", even if you inject for free (only recently have self-made systems been allowed to sell energy; before, they could only donate or not inject at all), and injected energy is now paid less than the cheapest price charged to customers (6 cents/kWh for ground-based PV, 10 cents for on-roof PV). So at least we do not harm the large utilities' business.

What does harm at scale is the variable output, especially from small PV installations built out of incentives rather than as personal power plants. The grid is sized around some large power plants serving a large set of customers; their load varies, but if the grid is vast enough (and not too vast), variation tends to be slow on average. Say a 50MW plant experiences 100-200kW demand variation in a very short time: it can compensate easily, keeping the grid frequency stable. With a significant amount of grid-injecting PV, variation can be MUCH bigger, creating serious stability issues: injection ramps up too quickly, frequency spikes, and large plants can't decrease their output fast enough, risking disconnection, which in turn might suddenly put large PV plants offline, creating a cascade of large blackouts.

That's the real issue with grid-connected and grid-tied renewables, and another reason why we need to move toward self-consumption, NOT injection.

bjornsing
0 replies
1d

As I understand it: because the incentives are wrong.

Owners of small-scale solar panel installations are paid a fixed price per kWh in many EU countries, regardless of the market price. The taxpayers pick up the tab, I guess.

shermantanktop
4 replies
1d

This article repeatedly cites the need for personnel to have diplomas, certificates, and other ceremonial bits of paper.

This focus on paper qualifications to mitigate risk seems a very European approach. Not saying it is wrong; it is just not emphasized as strongly elsewhere. And while it seems like a good fit for a slow-moving industry with high expectations of safety, the solar/wind world is not a slow-moving industry.

g_p
3 replies
23h15m

A good point - perhaps the focus is too heavy on paperwork or "measurable compliance".

From experience in this sector though, I think the real issue is a lack of technical awareness and competency with enough breadth to extend into the "digital" domain - often products like these are developed by people from the "power" domain (who don't necessarily recognise off the top of their head that 512-bit RSA is a #badthing and not enough to protect aggregated energy systems that are controllable from a single location).

Clearly formal diplomas/certificates are not needed for that - some practical hands-on knowledge and experience would help a lot there.

When a product gets a network interface on it, or runs programmable firmware, we should hear discussions about A/B boot, signatures, key revocation, crypto agility to enable post quantum cryptography algorithms, etc. Instead, the focus will be on low-cost development of a mobile app, controlled via the lowest-possible-cost vendor server back-end API that gets the product shipped to market quickly.
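To make the "signatures and key revocation" point concrete, here is a minimal sketch. A real inverter would use asymmetric signatures (e.g. Ed25519) rather than a shared secret; an HMAC stands in here so the example needs only the standard library, and the key store and key IDs are hypothetical:

```python
import hashlib
import hmac

# Minimal sketch of signed-firmware verification with key revocation.
TRUSTED_KEYS = {"key-v2": b"factory-secret-v2"}   # hypothetical key store
REVOKED = {"key-v1"}                              # revoked after a leak

def verify_firmware(image: bytes, key_id: str, tag: bytes) -> bool:
    # Reject unknown or revoked signing keys outright.
    if key_id in REVOKED or key_id not in TRUSTED_KEYS:
        return False
    expected = hmac.new(TRUSTED_KEYS[key_id], image, hashlib.sha256).digest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected, tag)

fw = b"inverter-firmware-1.4.2"
good_tag = hmac.new(b"factory-secret-v2", fw, hashlib.sha256).digest()
print(verify_firmware(fw, "key-v2", good_tag))    # valid key and tag
print(verify_firmware(fw, "key-v1", good_tag))    # rejected: revoked key
```

The point is that none of this is exotic; it is routine hygiene in other embedded sectors that hasn't made it into low-cost inverter firmware.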

Let's not even go near the "embedded system" mindset of not patching and staying up to date - embedded systems are a good place to meet Linux 2.4 or 2.6, even today... Vendors ship whatever their CPU chipset vendor gives them as a board support package, generally as a "tossed over the wall" lump of code.

I doubt many of these issues (which seem to be commercial/price driven) will be resolved through paperwork, as you say.

shermantanktop
2 replies
16h5m

In the rest of the tech industry, what you did to get your diploma gives you about 18 months of momentum. If you haven’t learned multiple new technologies by that point, you’re in trouble. Success in this industry means perpetually redeveloping your own skills, and liking it.

How someone would wave a 20 year old piece of paper as evidence that they know how to use solar tech that was developed last year, I don’t know.

applied_heat
1 replies
14h46m

I mean, electrical engineering teaches you a lot of the math, physics, control systems theory, and power systems knowledge that guides the design and operating characteristics of power systems devices like inverters. Sure, EE doesn't help with cybersecurity per se, but inverters and solar panels existed 20 years ago, so I feel like my 20-year-old electrical engineering degree is pretty darn relevant.

g_p
0 replies
13h13m

It certainly does - if you remain current then not a lot has really changed.

If you understand the principles of control systems and how an electrical grid works, this is broadly "just" a grid stability concern.

To some extent this feels like an issue of IoT-ification of things that we otherwise understood just fine! Maybe the real issue is how we blend cyber security knowledge into other sectors, and quantify and ensure it is present?

leymed
2 replies
22h42m

Reading through the comments I saw a lot of them confusing cloud security with the electrical safety of a system. Electrical protections are completely separate from the communication line/internet; they have to be hard-wired. As the size of a plant/substation increases, the automation and control system (again, a completely different thing from electrical protection) gets its own internet connection. Burning down a substation or exploding a transformer through solar panels is very, very unrealistic.

On top of that, PVs installed at homes are insignificant and cannot cause such troubles. As the size of an installation increases, you will have a different connection agreement and certain requirements. You can't install 15 MW and connect through the inverters used at homes, which are 100 kW at most. Even 15 MW is an insignificant change for a grid.

BlueTemplar
1 replies
10h15m

You're the one who seems to be missing something: we're not talking about local electrical safety, but about global grid stability, with a hacker potentially hijacking software controlling tens of GW spread out over many personal home solar installations.

leymed
0 replies
4h20m

My point is that hacking into home PV inverters doesn't affect grid stability; you can't penetrate the grid that way. At worst we're talking about those homes losing power for a short time. When demand is planned for a region, PV is specifically excluded from load-flow studies.

adolph
2 replies
1d1h

The short version: most consumer and business solar panels are centrally managed by a handful of companies, mostly from countries outside of Europe. In the Netherlands alone, these solar panels generate an output equivalent to at least 25 medium sized nuclear power plants. There are almost no rules or laws in Europe governing these central administrators. . . . The same thing goes for heat pumps, home batteries, and EV charging points.

Seems to me that this is very similar to the situation with IoT only with higher stakes. I appreciate this article's presentation of inverter and grid trust.

Beyond trusting customer inverters to do the right thing, I wonder if there is a method for safing a grid at the hardware level. Naive question: could there be a grid provider device that prevents overcurrent or incorrectly clocked cycles?

kwhitefoot
1 replies
1d1h

The utility company fuse between the property and the 240 V distribution system should prevent overcurrent. If the frequency or phase of the inverter is wrong the inverter might die first unless the network is already down.

There isn't really any practical way to prevent overvoltage though. So a rogue controller in charge of all the solar systems in a street might be able to do quite a lot of damage to consumer devices.

A problem from the utility point of view is that they can no longer guarantee that the 240 V side of the distribution system is safe to work on just by tripping a breaker on either side of the distribution transformer. So all work on the 240 V distribution system has to be done with the assumption that the system is live.

Eventually regulations will be updated, if necessary, to deal with large numbers of solar installations on domestic buildings.

cesarb
0 replies
22h54m

The utility company fuse between the property and the 240 V distribution system should prevent overcurrent. If the frequency or phase of the inverter is wrong the inverter might die first unless the network is already down.

To put it more simply: if the phase is wrong, the effect is the same as a short circuit, which fuses and circuit breakers protect against. If the frequency is wrong, the phase will become wrong after a number of cycles.
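The drift is easy to quantify: a frequency offset of Δf hertz accumulates phase error at 360·Δf degrees per second, so even a small offset puts the inverter in full opposition to the grid within seconds.

```python
def seconds_to_phase_opposition(delta_f_hz: float) -> float:
    """Time until a frequency offset accumulates 180 degrees of phase
    error, i.e. the inverter is in full opposition to the grid."""
    # Phase drifts at 360 * delta_f degrees per second.
    return 180.0 / (360.0 * delta_f_hz)

# An inverter running just 0.5 Hz off a 50 Hz grid:
print(seconds_to_phase_opposition(0.5))   # 1.0 second
# Even a 0.1 Hz offset reaches opposition in 5 seconds:
print(seconds_to_phase_opposition(0.1))   # 5.0 seconds
```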

There isn't really any practical way to prevent overvoltage though. So a rogue controller in charge of all the solar systems in a street might be able to do quite a lot of damage to consumer devices.

There is, it's called a surge protector or surge protective device (SPD). It converts any overvoltage above a certain level into a short to ground, which then trips the fuse or circuit breaker. It's often used as a protection against lightning-induced currents.

A problem from the utility point of view is that they can no longer guarantee that the 240 V side of the distribution system is safe to work on just by tripping a breaker on either side of the distribution transformer. So all work on the 240 V distribution system has to be done with the assumption that the system is live.

From what I've seen, the utility workers usually ground the wiring when working on it (they have a special-purpose device for that). Once it's safely connected to ground, it's no longer live.

twoodfin
1 replies
1d1h

Isn’t the right place to fix this at the junction between the plants and the grid? Regulate the grid utilities into a gateway role, and require all inverter control & telemetry traffic to pass through them.

This seems likely to be more fruitful than attempting to regulate 400 Chinese panel manufacturers.

What am I missing?

itishappy
0 replies
1d1h

You're thinking about this right, just at a utility scale.

The "plants" in this context are homes and businesses. The junction points between plants and grid are the inverters sold by the panel manufacturers.

timClicks
1 replies
1d

Does anyone know of an inverter manufacturer that doesn't require this? Ideally, one that offers micro inverters for each panel.

leymed
0 replies
23h5m

At greater scale, meaning power plants rather than the PV installed at houses, these things are taken more seriously, and after purchase of the equipment, the control and automation of the plant are in your hands. For example, Woodward and ABB have products with capacities up to 0.5 MW per inverter.

A micro inverter for each panel would be very costly. In a 1 MW plant you will have around 4,000 panels; communicating with that many electronic devices would be a headache.

formerly_proven
1 replies
1d2h

Most newer solar inverters can't even be set up without internet access, and most functions are only available with an always-on internet connection. This is also true for EU companies like SMA, for example.

grecy
0 replies
1d1h

I just installed a Fronius inverter (made in Austria) and 6.8kW of panels.

The inverter itself functions perfectly fine without an internet connection, and will display instantaneous power output on the screen. I could just be content with that and look at my monthly power bill to see how much I generated and how much I used each month and never connect it to the internet.

To get any kind of data logging & history from the inverter, it must be internet connected (wifi or ethernet). And all of that is through the manufacturer's website, which constantly nags me to "upgrade to pro" for some obscure feature that I'll never use.

bww
1 replies
21h15m

The author seems to imply, as if it were generally understood and accepted, that the reason nuclear reactors are heavily regulated is because they produce a lot of energy.

Perhaps that's a component, but one really doesn't need to think about it too hard to identify better explanations for why this particular energy source is held to unusually high regulatory standards.

I don't have an opinion as to whether other large-scale sources of energy should be held to similar standards, but to suggest that solar energy's failure modes are comparable to nuclear energy seems intentionally misleading.

gwbas1c
0 replies
21h8m

Here's the critical point:

In the Netherlands alone, these solar panels generate a power output equivalent to at least 25 medium sized nuclear power plants.

Because everything runs through the manufacturer, they are able to turn all panels on and off. Or install software on the inverters so that the wrong current flows into the grid. Now, a manufacturer won’t do this intentionally, but it is easy enough to mess this up.

As an interim step, we might need to demand that control panels stick to providing pretty graphs, and make it impossible to remotely switch panels/loaders/batteries on or off.

Basically, if a hacker were to make all batteries (or panels) suddenly switch between full discharge and full charge every second or so, it would tear down the electric grid. Voltage and frequency would swing rapidly, and whatever plants are riding load would struggle.

This could create a massive power outage; worse, there is a huge risk that it could damage power plants and other infrastructure.
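A toy, heavily simplified simulation gives a feel for the magnitude of the frequency swings such an attack could cause. This uses an undamped swing equation with every parameter assumed for illustration; real grids have damping, governor response, and protection relays that this ignores:

```python
# Toy simulation: grid frequency under a 1 Hz on/off power disturbance.
f0 = 50.0          # nominal frequency, Hz
H = 4.0            # system inertia constant, s (assumed)
S = 30e9           # system rating, W (grid scale from the thread)
P_attack = 2e9     # 2 GW of batteries toggled by an attacker (assumed)

f, dt = f0, 0.01
worst = 0.0
for step in range(int(5.0 / dt)):                    # simulate 5 seconds
    t = step * dt
    dP = P_attack if int(t) % 2 == 0 else -P_attack  # flip each second
    f += (dP / (2 * H * S)) * f0 * dt                # undamped swing equation
    worst = max(worst, abs(f - f0))

print(f"Worst frequency deviation: {worst:.2f} Hz")
```

Even this crude model swings the frequency by several tenths of a hertz, which is in the territory where under-frequency load shedding and generator protection start tripping.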

ThrowawayTestr
1 replies
1d1h

If the general public knew how fragile the power grid is, nobody would be able to sleep at night.

asynchronous
0 replies
22h50m

Not to be that guy, but the DOE is arguably one of the most important federal agencies in the US, and it treats the problem with the right amount of focus, research, and dedication. It's just a very hard problem. The grid is no less secure or resilient than it was 50 years ago; the main problem is that people are more dependent on it. Almost no one buys a personal generator before an outage happens anymore, despite it being one of the cheapest ways to get resiliency.

shahzaibmushtaq
0 replies
1d

The second figure explains a lot, practically everything.

Cloud-based management platforms should not oversee inverters directly.

samstave
0 replies
1d2h

I posted this question to HN 7 months ago, more around DataCenters:

In the increasingly interconnected global economy, the reliance on Cloud Services raises questions about the national security implications of data centers. As these critical economic infrastructure sites, often strategically located underground, underwater, or in remote-cold locales, play a pivotal role, considerations arise regarding the role of military forces in safeguarding their security. While physical security measures and location obscurity provide some protection, the integration of AI into various aspects of daily life and the pervasive influence of cloud-based technologies on devices, as evident in CES GPT-enabled products, further accentuates the importance of these infrastructure sites.

Notably, instances such as the seizure of a college thesis mapping communication lines in the U.S. underscore the sensitivity of disclosing key communications infrastructure.

Companies like AWS, running data centers for the Department of Defense (DoD) and Intelligence Community (IC), demonstrate close collaboration between private entities and defense agencies. The question remains: are major cloud service providers actively involved in a national security strategy to protect the private internet infrastructure that underpins the global economy, or does the responsibility solely rest with individual companies?

---

And then I posted this, based on an HNers post about mapping out Nuclear Power Plants:

https://news.ycombinator.com/item?id=41189056

[We can easily map the infrastructure of the cloud and AI -- and their supply chains - and these are increasingly of National Security Concern:]

((Not to mention the actual powerplants being built to exclusively provide datacenter power))

Now, if we add the layers of the SubmarineCableMap [0] and DataCenterMap [1], we can begin to track shipments

And

https://i.imgur.com/zO0yz6J.png -- Left is nuke, top = cables, bottom = datacenters. I went to ImportYeti to look into the NVIDIA shipments: https://i.imgur.com/k9018EC.png

And if you look at the suppliers coming from Taiwan, such as the water-coolers and power cables, to sus out where they may be shipping to, https://i.imgur.com/B5iWFQ1.png -- but instead, it would be better to find shipping labels for datacenters receiving containers from Taiwan, from the same suppliers as NVIDIA, for things such as power cables. While the free data on ImportYeti is out of date, it gives a good supply-line idea for NVIDIA... with the goal of finding out which datacenters are getting such shipments, you can begin to measure the footprint of AI as it grows, and which nuke plants they are likely powered from.

Then, looking into whatever reporting one may access for the consumption/util of the nuke's capacity in various regions, we can estimate the power footprint of growing Global Compute.

DataCenterNews and all sorts of datasets are available, and the ability to create this crawler/tracker is now likely fully implementable.

https://i.imgur.com/gsM75dz.png https://i.imgur.com/a7nGGKh.png

[0] https://www.submarinecablemap.com/

[1] https://www.datacentermap.com/

pshirshov
0 replies
18h27m

Cloud connectivity in Victron products is optional and disabled by default. Also there is a read-only mode.

The hardware is modular and the software is above any competition. Choose the responsible vendors.

lysecret
0 replies
21h15m

Same is true for heat pumps.

kybernetyk
0 replies
12h47m

No regulation? How will the private sector ever do its job if we don't put it under the control of incompetent bureaucrats?!

kuon
0 replies
23h48m

My installer put in a SolarEdge inverter; it took some real effort to keep it off the cloud while feeding its data into my Grafana. I can do it because I am a network engineer, but it should be easier.

Anyway, I agree that there should be a regulation that forbids remote management, so that you can only consult data remotely in a read-only manner (you could air-gap the inverter from the internet gateway using a one-way RS-232 connection where the inverter just writes continuously). And if grid operators need to be able to turn solar off, they should install relays controlled by their own infrastructure.
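The read-only telemetry idea can be sketched as follows. A StringIO stands in for the serial port here (real code would use something like pyserial's readline loop), and the line format is hypothetical:

```python
import io

# Sketch of one-way telemetry: the inverter writes lines over a serial
# link, and the monitoring side only parses them, never sends commands.
def parse_telemetry(stream):
    readings = []
    for line in stream:
        # Hypothetical line format: "PV;watts;volts"
        kind, watts, volts = line.strip().split(";")
        if kind == "PV":
            readings.append({"watts": float(watts), "volts": float(volts)})
    return readings

fake_port = io.StringIO("PV;2450.0;230.1\nPV;2380.5;229.8\n")
for r in parse_telemetry(fake_port):
    print(r["watts"], "W at", r["volts"], "V")
```

Because the physical link only carries data one way, no cloud compromise can turn the monitoring path into a control path.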

kkfx
0 replies
1d

That's why my system (Victron + Fronius) is offline, monitored with HA, with a BYD battery; unless there is a secret in-hardware backdoor, my home server can't reach the internet either. HA can, via WireGuard, so I can act/monitor when I'm outside my home, which might be a serious threat, but it's pretty easy to cut off if needed.

There is a more important point: while with PV we can still go offline, with cars we can't. My car is connected and I can do NOTHING to manage it; it's managed by its OEM behind my back, and that's a much bigger threat, since cars can paralyze the nation if blocked at critical points of the road network.

At the largest scale, that's the reason we can't have a national smart grid but only individual smart microgrids, meaning PV should be used only for self-consumption, NOT grid-tied like in California.

isoprophlex
0 replies
22h47m

If the west for some reason starts to vigorously argue with China over something, we're all completely fucked. They'll just tell our cheap EVs to forget how to brake, melt the firmware in our cellular towers/chips, and toggle our PV inverters off and on at a shitty time.

davedx
0 replies
20h46m

Wait, what. I don't know if my inverter does what they say. For one thing, the vendor went bankrupt, so there is no cloud dashboard anymore. For another, there are hundreds of inverter vendors, not a single one. And I am highly sceptical that the basic dashboard showing solar generation has some sinister inverter backdoor kill switch when the article seems to provide no evidence of such. Seriously?

Edit: did some research, and apparently it varies: many modern inverters can be remotely controlled by manufacturers, if they're set up to allow it and are internet-connected.

The article is still sensationalist about the risks, though.

amai
0 replies
15m

Did I read Huawei in the list of solar power plant management providers?

_trampeltier
0 replies
1d

I don't remember when and where exactly (and didn't find it in a quick search), but there has already been an incident where an automatic update failed. I think it was something with the country code, so it was somewhat isolated and not worldwide.

Kon-Peki
0 replies
1d1h

Incidentally, why are all those panels centrally connected anyway? I’d like to know what my panels are doing, but you don’t need the internet for that.

This is because of the market for carbon credits. When you installed your PV panels, someone estimated how much electricity they would generate over the next 10-15 years. Tradable carbon credits were created based on that estimate and went into the marketplace. And for the next 10-15 years they have to verify that the electricity was actually generated, or else someone has to pay back some money. Did you read the fine print on your contract? It is probably you that has to pay it back. You didn't know that one of the "rebates" you got was actually a pre-payment for those credits?!? Should have read the fine print ;)

Oh yeah, BTW: that "rebate" was only your portion of the credits. The installer got some of it (and doesn't have to pay back anything), the person that filled out the paperwork you didn't know existed got some (and doesn't have to pay back anything)...