This is the sort of thing that almost (but not quite) pushes me to support mandatory certification (enforced by patents/other IP protections) for consumer devices that use a familiar connector like USB.
Consumers shouldn’t need to worry about the specific output voltage; the presence of a USB-A connector should indicate that it abides by the specification.
The hacker in me hates the idea of enforcing something like this, but poor interoperability is such a pain that it would be nice to have stronger guarantees.
I think when it comes to safety, restrictions should be there. The hacker in me hates it too, but just imagine if a small portion of standard-looking outlets did not output 220V (or whatever standard you have) but, say, 330V.
Many appliances would break, or perhaps even catch fire.
If there were many different USB-A devices and people were used to checking the voltage output before plugging in, that might not be needed; but since probably none of us checks a USB-A port to see whether it outputs 7.5V, certification had better be mandatory.
Though maybe it already is, and those cheap Chinese adapters slip through anyway.
From my experience, appliances tolerate a wide range of voltages. The outlets in my house routinely deviate from the standard 230V, with the lowest I have seen at 170V and the highest at about 260V. I have not had any appliance malfunction.
Edit: I am not advising anyone to deliberately do stupid things, simply mentioning that from my experience the built-in margin is fairly large.
You (or your landlord) should probably hire an electrician to get that checked out – especially the higher voltage might be indicative of some scarier failure modes.
A voltage drop of 25% could also mean you have semi-broken wiring somewhere, which can cause fires or shock hazards on faulty appliances.
Dips and spikes are routine in electrical power delivery. In my experience, dips are much more likely than spikes to be the culprit in damaged equipment. People are used to protection from spikes with surge protectors, but the only thing that protects from dips is a UPS. I use a small UPS as part of my entertainment equipment. Not because I want to watch TV during an outage, but to keep the power dips from damaging the equipment.
How would a power dip damage an appliance?
For AC motors, the lower voltage might mean they do not have enough power to turn, so all the power they consume becomes heat in the motor windings (and since they're not turning, there's less cooling).
For electronic power supplies, they might either pull more current to maintain the same output (again leading to more heating), might output a lower voltage (and DC-DC converters which consume that voltage might in turn draw more current, again leading to more heating), or they might even misbehave and output an oscillating voltage; they might also detect the power dip and shut down, only to power up again moments later, repeatedly.
A device in an idle stand-by mode is probably the best case, since it's using little power from a supply sized for a much bigger load.
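As a rough illustration of why lower voltage means more heat, here is a toy calculation assuming an idealized constant-power load and a made-up wiring resistance (numbers purely illustrative):

    # Toy model: an idealized constant-power load.
    # At lower input voltage it draws more current (I = P / V), and
    # resistive losses in wiring/windings grow with the square of the
    # current (P_loss = I^2 * R).
    load_w = 500.0       # assumed constant power draw
    wiring_ohms = 0.5    # assumed resistance of cabling/windings

    for v in (230.0, 200.0, 170.0):
        i = load_w / v
        loss_w = i ** 2 * wiring_ohms
        print(f"{v:5.0f} V -> {i:4.2f} A drawn, {loss_w:4.1f} W lost as heat")

At 170 V the same 500 W load draws about 35% more current than at 230 V, and the resistive heating nearly doubles.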
I disagree with the sibling comment that the only protection from voltage dips is a UPS or similar; a simpler protection is an undervoltage relay with a timer, to convert the voltage dip into a power outage (and also prevent the power from being restored too quickly).
* The only protection available as a consumer add-on.
Sure, if the manufacturer wants to increase the BOM cost of their PSUs, they can use additional components to survive dips, but who does that on disposable electronic equipment? Hell, even "expensive" TVs don't. I put "expensive" in quotes since you can now get 65" 4K TVs cheaper than a mobile phone, but that's a tangent. So if you want to protect your electronic equipment with a readily available, plug-and-play device that requires no knowledge of electronics, what do you suggest?
I have a friend who did that; he put one of these relays on the apartment's power panel, so the whole apartment was protected against voltage dips or too short outages. It might not be a common consumer item, but it's a readily available device, AFAIK commonly used to protect industrial motors.
Common in industry does not come close to common for consumers.
Adding something to the power panel is not something a consumer can do. How is this even being confused?
If it's turned off or it is on and not being used, probably not much. But more and more equipment no longer has an off mode and is more of an idle stand-by mode, so current is still always being used by part of the system.
The point is that in electricity delivery, it is normal to have deviations in the voltage as acceptable. Most power supplies have tolerances that can handle these deviations. A dip is something outside of the deviations that most people don't consider as they are only focused on spikes and only use surge suppression devices. Since the only protection from dips is a more expensive battery solution, most people do not bother outside of computer related usages.
Ah, so e.g. something like a device flipping between “powered” and “not powered” states, which then wears out some part that doesn’t expect that state change to happen so often?
No, not quite. A dip is a drop below the accepted deviation from the expected value. If you have an accepted range of +/-10% but the voltage drops 15%, that's a dip. Some places define an outage as being less than 5% of the expected value. So somewhere before an outage, you have a severe undervolt situation where the equipment struggles. The longer the dip lasts, the worse things can get.
One of the shops that I worked at suffered a catastrophic situation when a nearby construction crew cut one phase of the 3-phase power coming into our facility. The poor transformer died, and took out some of the more delicate PSUs attached to some very expensive equipment. We were down for weeks recovering from that.
It's not a power dip, it's a voltage dip. An appliance will often draw the same power at the lower voltage, so the current increases, which can cause overheating and damage.
Typically in these dip situations, you can hear the PSUs being very unhappy about the situation. A cringe inducing high pitched squeal can be heard from any PSU that's doing work as they try to continue doing their job while in this undervolt condition.
I did but unfortunately it quickly turned into all parties involved blaming each other (it's your utility, it's your installation...) and it got nowhere. For the past year it's been much more stable than before so I don't worry too much about it.
Make sure you have good insurance and multiple loud smoke detectors...
Insurance won't cover it if you knew there was a fault and didn't fix it!
Do you have a citation for that?
Sure. This is the relevant clause in my current home insurance policy:
"You must take steps to maintain the Home in a good state of repair and take all reasonable precautions to avoid loss, damage or injury and to safeguard all property insured from loss or damage."
These things are pretty standard - I would be very surprised if there isn't a similar clause in all of them.
Which implies that, if you fail to do so, they don't have to pay up.
The wording in your policy may be different. What counts as "reasonable steps" will depend on legal precedent. It may, for example, take into account that you're not an expert, and will probably include some reasonable time to get stuff fixed. But certainly it provides them the opportunity to deny a claim.
Obligatory disclaimer: I am not a lawyer or insurance expert, get your own advice if you need it.
It seems plausible that not addressing the electrical anomaly described by nicolaslem could be called a failure to maintain the home under your policy.
For completeness, the relevant section from my policy is "We will not pay for any loss to the property...as a result of...Neglect, meaning neglect of the insured to use all reasonable means to save and preserve the property at and after the time of a loss, or when property is endangered". But in the same context, it says "However, we will pay for any resulting loss from ... defect, weakness, inadequacy, fault, or unsoundness in ... design, specifications, workmanship, repair, construction, renovation, remodeling, grading or compaction; [or] maintenance."
It would be shaped by the legal precedent in interpreting these sorts of clauses, and perhaps also the whims of a jury. It still seems unlikely to me that either of our policies would end up excluding the electrical anomaly, and moreover I've yet to hear a verifiable account of someone whose homeowners insurance claim was denied after an electrical problem in their house led to a loss.
I don't know if that exact case can be found, but it's a fairly obvious generalisation of the general principle: if failure to repair a fault or dilapidation in a timely way can result in reasonably foreseeable further damage, insurers will be very reluctant to write policies covering the further damage, because it would create moral hazard in the purchaser of the insurance. So I personally would not like to rely on it being covered (apart from the obvious risk to the person of the house burning down).
That's not to say that the original electrical fault wouldn't be insured - that depends on other terms.
Your policy does seem to be more lenient than mine in that it allows "resulting loss from" ... "inadequacy or unsoundness in" ... "maintenance". I wonder what the scope of that is in reality.
Nobody can provide you with a citation because the specifics will be the wording that's in your individual policy.
But, generally, policies require you to maintain the property.
See above about a broken neutral. A broken neutral is bad. One could easily load a system or even just have wetter soil so that the voltages are well balanced for a while despite the neutral being open, but that doesn’t make it less bad.
I would not assume that this fault is not dangerous TBH. May well be a good idea to hassle them some more. If you're renting, you don't need to care who is at fault - your landlord needs to sort it. If you own, you may need a 3rd party electrician to settle the argument.
My first guess would be something like a broken neutral wire, making the neutral voltage depend on the imbalance of the load between the AC phases. From the little I understand about it, you could have up to the phase to phase voltage between the broken neutral and one of the phases, with the voltage dynamically changing as devices are turned on or off. It's very confusing (since everything expects the neutral to be referenced to the earth, which is constant 0V by definition), and can burn up devices.
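A toy sketch of that mechanism, using a US-style 120/240 V split-phase service for simplicity and assuming purely resistive loads (the same idea applies to other systems):

    # With the neutral intact, each leg sees 120 V regardless of load.
    # With the neutral broken, the two legs' loads end up in series
    # across the full 240 V, so the voltage divides in proportion to
    # the load resistances.
    LEG_TO_LEG_V = 240.0

    def leg_voltages(r1_ohms, r2_ohms):
        i = LEG_TO_LEG_V / (r1_ohms + r2_ohms)   # series current
        return i * r1_ohms, i * r2_ohms

    print(leg_voltages(10, 10))   # balanced: (120.0, 120.0)
    print(leg_voltages(30, 10))   # imbalanced: (180.0, 60.0)

The lightly loaded (higher-resistance) side sees the overvoltage, and the split shifts every time something switches on or off.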
That definitely reads like somebody removed/forgot the neutral bond.
Your understanding is correct: if the neutral is allowed to float, its voltage will depend on the ratio of the load on the two phases.
It also creates a variety of hazardous conditions and opportunities to make spicy metal.
You can play with it here courtesy of Paul Falstad and Iain Sharp: https://tinyurl.com/yq4lukm6
Are you measuring RMS voltage or just the instantaneous voltage?
An RMS voltage of 230V corresponds to a peak voltage of ~325V, so a reading of 260V could theoretically happen.
That's the reading from my UPS, I believe it's RMS since it reads 234 V currently.
That's not how it works. For a sine wave, the RMS integral comes out to Vpeak/sqrt(2), so an RMS reading of 260V can't be explained as the meter catching the ~325V peak.
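For reference, the sine-wave relationship (Vpeak = Vrms * sqrt(2)) worked both ways:

    import math

    # 230 V RMS corresponds to a peak of ~325 V...
    print(230.0 * math.sqrt(2))   # 325.27
    # ...and a true-RMS reading of 260 V implies a peak of ~368 V,
    # i.e. a genuine overvoltage, not a meter catching the peak.
    print(260.0 * math.sqrt(2))   # 367.70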
That's quite a spread. Is it up to spec where you are? In Australia we have 230V nominal, but the allowed range is from 216V (-6%) to 253V (+10%).
Anything with an AC/DC conversion stage would be fine; anything with an AC motor would not. In particular, not every compressor in a refrigerator would tolerate that.
But 170V... yikes!
Most of them are either 110-240V appliances, or just lose performance when the voltage drops to around 170V, and you may not notice.
260V is just past the +10% threshold (253V on a 230V nominal supply); it possibly falls within the engineered-in tolerance range, but will probably shorten the life of many appliances if supplied constantly.
Lots of devices have switched mode power supplies that will still output the same voltage. Last time I had a brownout the only thing that stopped working was the washing machine, and I thought that was broken.
Paradoxically, it may be the lower voltages that are more dangerous, because it will result in a higher current draw from all the devices.
Mains voltage has a nontrivial allowed range (in fact in Europe they increased it slightly so that devices would work across countries with slightly different original standards) but this is well out of spec.
We have something like that here in Brazil. On cities which use 127V as the standard voltage (some cities use 220V), you can get 220V by using a pair of phases instead of a phase and the neutral, and it's common to find a few standard-looking sockets which are 220V instead of 127V. They might be colored red, they might have a yellow sticker saying "220V" next to it, or they might be completely unmarked. If your device is 127V only, or it has a manual 127V/220V switch which is on the 127V position, and you plug it into one of these 220V sockets, it will be damaged.
(The standard we use for consumer AC power plugs and sockets, NBR 14136, does not make a distinction between voltages; the same plugs and sockets are used for both 127V and 220V.)
That sounds incredibly error prone and actually quite dangerous! At least a lot of AC to DC converters are universal voltage and frequency, but not all appliances!
In Australia we have only one voltage that will ever come out of a single phase socket, but the standards took the opposite approach with current ratings - the sockets are cleverly designed in terms of their current capacity so (assuming it’s been installed correctly) you can’t make mistakes without illegal and dangerous modifications. In the standard socket, you can have 10 amp, 15 amp or 20 amp, and they all have different sized pins. For the 15 amp plug and socket, the ground pin is larger, so a 15 amp plug cannot fit into a 10 amp socket. A 20 amp plug has all three pins larger, so it doesn’t fit in a 10 amp or 15 amp socket. But a 10 amp plug can still fit and sit snugly in a 15 or 20 amp socket, and a 15 amp can still fit into a 20 amp socket with no problems. It’s pretty clever.
For bigger stuff, or anything three-phase, you then have big industrial circular ones, which I think have a similar system to not let you plug a higher current device into a socket that can’t supply enough current. I believe there are also three phase plugs and sockets both with neutral and without neutral, and if the plug has a neutral pin it doesn’t fit into a socket that doesn’t supply neutral.
That sounds dangerous. The larger fuse sized for the 20A socket will not protect the smaller conductors in a cable sized for a 10A plug; they can get overloaded.
The three-phase / industrial CEE plugs and sockets used in the EU are sized for 16A/32A/63A/125A. It's perfectly legal and safe to make a passive adapter that lets you plug a 32A appliance or distribution box into a 16A socket, you just can't load it higher than the 16A fuses on the socket. Going the other way around (16A plug into a 32A socket) you MUST have a circuit breaker in between to provide the required overload protection.
It's not dangerous in practice. Reading through the NEC in the US, there's plenty of leeway for mixing ampacity ratings on circuits, within certain limits. For example, it's fine to have 10x 15A duplexes on a 20A circuit. It results in less wiring, and allows for logical grouping of circuits. Electrical code starts from the most common causes of electrical failures, and tries to design down from there. The ampacity ratings on devices and various components of the circuit are designed to reject the common causes of failure, and conductor failure at the device isn't one of them (appliance manufacturers are more likely to get their devices certified, so if there's a failure in the appliance, some other part of the appliance's electrical system will fail before the conductors do).
So the ampacity of the connector (duplex and plug) is meant to reject certain combinations (plugging a 30amp appliance into a 15amp duplex/circuit). It's obviously very easy to get around that by just wiring things together wrong (18AWG wire on a 30Amp circuit), but by design, a 30amp appliance with the proper plug cannot be plugged into a 15amp outlet.
Of course, this is informed by the electrical code in my home country, so I understand that other places and people have different experiences, but electrical accidents have been on a steady downward trend since the 80's, so the NFPA NEC has to be doing at least some things right.
Is "ampacity" really a better word than "current"? And the unit symbol is just "A", like "30 A".
I realize I'm nitty but this is a technical-minded forum after all.
"ampacity" means "capacity in amperes" or more literally "ampere capacity." You could say current capacity too but it takes longer. Ampacity is the language used in the electrical code.
"amp" also seems to be more common to write than "A" for electricians. Remember that the SI standardization of unit abbreviations is actually a pretty modern thing, so lots of fields existed before unit abbreviations were standardized and continue to use existing practices.
Ah I did not get that meaning, then obviously it's a perfectly cromulent word. Thanks and TIL.
Nah, it's fine to be nitty. If you glance at my comment again, you'll see my usage devolve from A to amp halfway through, my brain can't help but write what it sounds like to me.
Yes though, ampacity is the correct word (for my home country). "Current" is an instantaneous measurement of current flow. That they use the same unit is pretty convenient for planning circuits. https://en.wikipedia.org/wiki/Ampacity#:~:text=Ampacity%20is....
It is indeed very regional, e.g. the UK has its own thing going on with ring mains and fused plugs.
Over in my 230V corner of the EU, CEE 7/3 sockets are pretty much the only thing I'm aware of for new wiring, with the unearthed CEE 7/1 sockets still present in older installations. Both are rated for 16A, with either 10A/1.5mm² or 16A/2.5mm² branch circuits from the distribution board. It would be unheard of to wire those up with any higher ampacity at risk of overloading the sockets - although recent experience with e.g. EVs has shown that they're not actually suitable for sustained loads at 16A, given e.g. wear and tear on the sockets.
Unearthed appliances do commonly have CEE 7/16 plugs rated for 2.5A, particularly things like wall warts or USB chargers, which do fit into a CEE 7/3 socket.
I recall seeing CEE 7/16 (2.5A europlug) sockets, although those may have been imports from elsewhere in the EU. I suppose those would have the same issues if wired up to the same distribution circuits. But extension cables for those are rarer, the only ones I've ever seen are CEE 7/7 plug (16A) -> CEE 7/16 sockets (2.5A).
For residential/commercial loads over 16A / 3.2kW, it's all CEE / IEC 60309 connectors, and those are treated like distribution circuits with overload protection sized appropriately for the connectors/cables.
That doesn't make sense though. The fuses aren't to protect the load; they are to protect the circuit. It's up to whatever device you are plugging in to not draw more current than it can handle.
If I have a 3 Watt doohickey, I don't go looking around to find a 0.025A socket, I plug it in to my 15A wall socket and everything works great.
Yeah, the risks mainly come with extension cords, like plugging a bunch of 10A appliances into a single 20A outlet via a 10A rated extension cord with multiple 10A outlets.
Appliances themselves with their own input fuses and captive cables are less risky, although some kind of internal wiring short could still get all melty.
It's true that sizing the plugs that way does prevent you from plugging a 20A appliance into a 10A extension cord, which makes sense.
Each individual socket isn’t fused though. The entire circuit is. The fuse(breaker) is going to be significantly bigger than any of those amperages.
The breaker is there to protect the wiring in the building, nothing more. It’s spec’d to the gauge of the cable in the wall typically.
Brazil should use outlet types to indicate the voltage: Type A for 127V, the worldwide standard for 120V plugs, and Type N for 220V, where there is no worldwide standard but Type N is a pretty good design. Changing outlets is pretty easy to do.
That would be going backwards, and losing all the safety features of the new NBR 14136 ("type N") standard; to make it worse, we already did use something like "type A" (with "universal" sockets which accepted both flat and round pins, and sometimes even grounded "type B" together with these two, it was a mess) for both 127V and 220V, so you'd still have (in older buildings) these "type A" plugs being used for 220V.
Yeah, Brazilian NBR 14136 does that too. It has 10A and 20A variants, and the only difference is that 20A has slightly larger pins, so it won't fit into a 10A socket. (And the international standard it was based on, IEC 60906-1, has only a 16A variant with an intermediate pin size, so the 10A plug would fit into its socket and its plug would fit into the 20A socket.) Annoyingly, that's the only difference, making it quite hard to tell just by looking whether it's a 10A or 20A plug/socket; you have to try and see if it fits.
In my old house (US) there was an outlet that was set up that way: two phases plus ground on a bog-standard-looking 120V outlet. We made the discovery when we plugged a 500W work light into it while renovating and it exploded after a few minutes.
So you had a glorious 2kW work light for a few minutes.
Yeah, the wiring code says the 220V outlets in 110V areas must be red. Marking is optional.
But I've actually never seen a red one.
That's odd, because in a lot of contexts (hospitals, EMS, etc.), red outlets are used to denote outlets that are connected to battery/UPS, i.e. critical loads, versus the regular, which can be 'sacrificed' in an outage.
Plus there’s 240V in Paraguay. I travel with cheap universal adapters and never have any problems.
Overvoltage is now routine in places where there's lots of PV installed in homes and not enough load nearby.
If you live in such a place, you have a deep appreciation of varistors and should have either good insurance or a little bit of soldering skills... preferably both.
Every solar inverter / microinverter manufactured in the past two decades (if not older) must monitor the grid voltage+frequency and disconnect if it falls outside of a certain tolerance. If things go out of tolerance there is a defined period of time where it must recover or the inverter is required to trip out. There are also hard safety limits that cannot be exceeded whatsoever. I am not aware of any conditions that would result in overvoltage.
In CA as of (IIRC) 2021 these systems must be hooked up to some kind of online monitoring so the utility can temporarily command them to exceed tolerances. If the grid is going unstable due to load the last thing you want is everyone's PV system tripping off making the supply situation worse. So the utility can inform the system to exceed the normally tight tolerances by a larger deviation. They can also command PVs systems to shut off but there are strict limits on how often and how long they can do that and it must be driven by grid stability needs.
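As a toy sketch of the kind of ride-through/trip logic described above (all thresholds and timings here are made up for illustration, not taken from any actual grid code):

    NOMINAL_V = 230.0
    LOW_V, HIGH_V = 0.85 * NOMINAL_V, 1.10 * NOMINAL_V   # assumed window
    MAX_EXCURSION_S = 2.0                                # assumed ride-through time

    def should_trip(voltage_samples, dt_s):
        """Trip if the grid voltage stays out of tolerance longer than
        the allowed ride-through period; brief excursions are ridden out."""
        out_of_range_s = 0.0
        for v in voltage_samples:
            if LOW_V <= v <= HIGH_V:
                out_of_range_s = 0.0          # back in tolerance: reset timer
            else:
                out_of_range_s += dt_s
                if out_of_range_s > MAX_EXCURSION_S:
                    return True               # sustained excursion: disconnect
        return False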
Yes, absolutely, same here.
The problem is varistors blow out before the safety measures work. Perhaps due to wear or low quality, but this still happens.
Every once in a while an electrician wires up standard US outlets to 480v 3-phase instead of 208v 3-phase, so you end up with an outlet designed for 120v that puts out 277v.
Sometimes no one notices, because the switching power supplies plugged into them are just fine with this.
It's got a CE mark, but it's unclear if it's valid or not. This is a mandatory certification in the EU, like the UL listing in the US.
The device is specifying the output voltage, which isn't in spec for the connector, but it is (apparently) accurate.
UL isn't actually mandatory in the US. The only mandatory thing I am aware of is FCC certification for devices with signals over a certain frequency, and the FCC wouldn't care about a USB port being abused.
Good luck filing an insurance claim for damage caused by the use of a non-UL-listed appliance.
What language in an insurance policy requires the use of UL-listed appliances for coverage?
None.
Stuff like the post you are replying to is common "good luck with insurance" trope. Insurance protects you from stupidity, errors, and negligence. It doesn't protect you from fraud or intentional malice, but using an uncertified device is not fraud.
Note that after an accident related to personal stupidity or error, an insurance company could decide you are too risky to further insure, or that the cost for your insurance should increase dramatically. But they can't deny a claim.
All of this is of course a significant summary, but insurance won't (can't?) deny a claim linked to an uncertified electrical device in normal circumstances.
I used to work for a major clothing retailer. Our insurance policies did require us to use UL certified appliances in our stores.
Did you read the policy, or is this what somebody told you?
I read somewhere that there's no such thing as a UL-approved turkey fryer.
https://www.ul.com/news/put-safety-menu-thanksgiving
I think that is still the case.
Every turkey fryer I've ever seen is a propane burner and a well-sized pot to put on top of it: zero electronics, unless they have a thermostat-type feature (which I've never seen).
I’m pretty sure UL applies their testing to anything with safety ramifications.
Gotcha, I thought it was just an electrical certification. However, if they won't certify a propane burner and a pot just because it's labeled "turkey fryer", I'm not sure how they can certify any gas grill, gas range, or propane burner of any kind.
Has to do with the large amounts of flammable oil that inevitably spill out.
Some years ago, I saw a video (done by Consumer Reports, I think) that showed a fireman in full heat gear using a turkey fryer, demonstrating how incredibly easy it is to spill the oil.
Every Thanksgiving, there are always quite a few turkey fryer accidents.
https://www.osha.gov/nationally-recognized-testing-laborator...
I recently ran into this buying an EVSE to charge my car. Amazon in the US is full of imports from generic Chinese brands that are half the price of name brands but not UL certified. Many of them say "UL certified cable" which doesn't tell you anything about the entire charging unit, just the wire stock they used.
It's all fraud. And at 40 amps, potentially quite dangerous.
Some devices do require a NRTL (nationally recognized testing laboratory) mark. That could mean UL or TUV or whatever. NFPA 70 will refer to “listed” or “recognized” devices in such cases.
There's a deceptively similar mark that stands for China Export and does not indicate EU certification[0].
[0]https://www.kimuagroup.com/news/differences-between-ce-and-c...
There is no such thing as a “China Export” mark. No products has been shown to have it. It’s a fake urban legend: <https://en.wikipedia.org/w/index.php?title=CE_marking&oldid=...>
That's so interesting.
I've just done a quick dive into my electronics. I have a Tonbux power strip where the C and E are definitely too close together.
Edit: The website I linked is a Chinese manufacturer. If it was ever just an urban legend, it now seems to be treated as fact in a way that matters.
As per Wikipedia, quoting the European Commission, people displaying the CE logo with incorrect dimensions exist. This does not prove the legend.
Please stop spreading urban legends on the basis that other people believe in them.
I'm not interested in arguing you out of your surety. I do want to note that I believe that when an urban legend is implemented in reality, it is no longer a legend. And you aren't going to argue me out of expressing my opinion.
Wow, that's incredibly close. I assumed there would be a public database of CE (European) approved devices or manufacturers, but I can't find one from a quick search.
The CE marking is a self-certification program, and there isn't an approval process. All that can happen is that you get fined after the fact for applying the mark to products which don't fulfil the necessary requirements.
Technically it is a self-certification program, since it's the device manufacturer who signs the certificate, but that's not the whole truth. A manufacturer still needs to prove compliance (in the case of the EMC directive, either Annex II or III) or slapping on the mark is just fraud.
The CE mark (the real one) is self-certified.
For context, this isn't an official mark created by China or any other entity.
It's just some fraudulent manufacturers using a paper-thin excuse to illegally apply a fake CE mark.
Looks like CE to me: https://u.ale.sh/iaN5ie.png
It is a self-certification mark though so ¯\_(ツ)_/¯
It's mandatory to have the CE mark.
It's a self-certification. You can decide how much that's actually worth for non-EU vendors.
Fair point, but given that some manufacturers have no problem at all to just blatantly use brand names without permission, having a CE mark is unfortunately not a sure sign that a device has ever been inside a certification lab…
Anyone can stick a CE sticker onto their products. It's a self certification mark, and I've never heard of a company ever getting in trouble over this stuff. All it says is "I promise to follow the CE rules" but there are no mandatory audits or anything, unless a company gets found out to be violating the spec.
I don't think the CE mark protects consumers the way it could decades ago, with international imports taking a few seconds and free postage to boot.
The device is specifying the output voltage, in very black, very tiny letters. How convenient. The person who took this almost impossible photograph should get a Pulitzer. No quantitative measurement for the real voltage, or current for that matter.
https://donglec.com/blogs/journal/are-third-party-chargers-h...
This.
Nintendo comes to mind; charging the Switch (the earlier models IIRC) with a third-party adapter was a lottery, and the losing prize was a fried console.
They've been guilty of this practice for as long as they've been making electronics - I think it was the NES or the SNES that used a standard barrel jack and voltage, but reversed the polarity, with no circuit to protect the console.
Also - using Ethernet ports/cables for serial console access is not even evil, it's just stupid. Granted the switches/routers that do this are not exactly SOHO equipment, but you need to let students in a lab near that, and it's just asking for trouble.
The problem you're thinking of was caused by a Nyko third-party docking station with a faulty hardware implementation of USB-PD. While Nintendo's software implementation of USB-PD was flawed in some ways, it didn't cause this problem.
There is, sadly, no "standard" polarity for barrel power jacks. Center-positive and center-negative are both fairly common.
I stand corrected.
The dock however, still rejects non-Nintendo chargers. I have an MBP, a Thinkpad, an iPad, and a Switch - every device works with each other's (and third-party) chargers, except the Switch, when docked.
I think these USB-C PD (and other) issues were corrected in the OLED Switches and the docks that come with them, but the fixes might not have been total.
This comment, specifically about some of the technical details of the video-out stuff, points to at least some changes having been made in the new generation: https://www.reddit.com/r/UsbCHardware/comments/171c3mj/comme...
Someone can correct me on this as well. I've noticed that my Switch will charge properly and quickly on the provided charger and a few other USB-C chargers I have, but much more slowly on certain others. My theory has been that not all USB-C chargers are compatible with all voltages, and that perhaps the Switch's preferred fast-charge voltage is somewhat uncommon (I think it's 17V?). Can anyone confirm this is how these things work? I just assume there is a negotiation where the charger says "I can supply these voltages, perhaps at these max amps", and the device chooses one or none of them depending on its needs.
Turns out I seem to be right about the negotiation: https://acroname.com/sites/default/files/shared/blog-basic-p...
I have the OLED model, and the accompanying dock refuses every single non-Nintendo charger I've tried. It literally tells me so on the screen.
Correct. Both devices present each other with a list of supported modes and negotiate the optimal one. If you try a fancy charger/powerbank with a screen, it will even present per-port current/voltage stats. It's fun to observe the changing power draw on different devices.
Also, if you plug another device into the same charger, it will trigger a re-negotiation, possibly ending up in a different mode. LTT had a whole video on modding ancient consoles to use USB-PD, and the possible resulting issues.
https://linustechtips.com/topic/1470228-how-to-put-usb-c-pow...
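For anyone curious, here is a toy sketch of the negotiation being described. Real USB-PD exchanges source-capability and request messages over the CC wire; the voltages and currents below are made up to resemble a typical charger:

    charger_offers = [        # (volts, max_amps) advertised by the source
        (5.0, 3.0),
        (9.0, 3.0),
        (15.0, 3.0),
        (20.0, 2.25),
    ]

    def pick_mode(offers, preferred_v, needed_w):
        """The sink requests its preferred voltage if it's offered with
        enough power, otherwise it falls back to the 5 V default."""
        for volts, amps in offers:
            if volts == preferred_v and volts * amps >= needed_w:
                return volts, amps
        return offers[0]      # 5 V default: slow charging, but safe

    # A sink that fast-charges at 15 V and needs ~39 W:
    print(pick_mode(charger_offers, preferred_v=15.0, needed_w=39.0))  # (15.0, 3.0)

A charger that doesn't offer the sink's preferred voltage ends up at the 5 V default, which matches the "charges, but much more slowly" behaviour.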
Is there actually a standard about that? IIRC, recently things have drifted towards "center positive" as customary, but most adapters and devices have the little diagram about how they're wired for a reason.
I think everything I've seen that Sony makes is centre negative. For their consoles, this only mattered for the PSone and the slim PS2.
Always seemed dumb to me, since you'd want the "ground" side to make contact first when inserting.
Audio and audio adjacent gear is nearly all center negative. Everything else is probably center positive, but you still have to check. No standards.
It wasn't mandatory, but this is one of the main reasons Underwriters Laboratories was started. The market needed a certification to say "yes, this thing does what it says it does, and in a safe manner".
My dad owned a retail store in the 1970s, and he distinctly remembers being told "Lots of retailers won't carry a brand/item unless it has the 'UL' sticker on it."
https://en.wikipedia.org/wiki/UL_(safety_organization)
These days, anything can have any sticker.
Decent chance there is a little UL logo on the cat water fountain.
Especially TÜV. It means next to nothing.
Would this be comparable to a CE marking in Europe or more like a TÜV certification?
Sort of. UL and CE are certifications that declare a product conforms to the standards defined by those groups. The main difference is that UL focuses only on product safety (i.e. shock and mechanical hazards), while CE covers product safety plus environmental and health requirements.
UL is technically a global standard, but mostly used in USA/Canada. That's only two countries which already have laws governing what can and cannot be put in a consumer device.
I'm not as familiar with the history of CE, so I don't know why CE decided to cover environmental standards. But my speculation is that it's because CE covers ~30 countries with a larger range of laws.
You can find many fake "UL certified" products on Amazon, with a counterfeit UL logo and all. I think the problem is companies like Amazon not being held liable for what they sell.
If the hacker in you really hates this, then why insist on enforcing via means that restrict both vendor and user freedom (IP) instead of regulatory enforcement that exclusively targets vendors & leaves hackers free to hack?
Hackers want not just to hack for themselves but to share the products of their craft with other parties without going through fences.
Some do. Some don’t. The various uses of the term “hacking” do not universally require tech to be shared.
Long before RMS and other ideology-oriented types, people were adapting commercial tech to local purposes which the vendor didn’t intend and wouldn’t endorse.
Are you implying that regulation need put up more such fences than IP restrictions?
What would the regulatory enforcement look like?
If I develop a new connector, how do I get the regulator to start enforcing that everyone who uses it is following my standard?
What happens when I find a way to extend the specification in a safe, backwards compatible manner, but this conflicts with the standard as enforced by the regulator?
I’m not saying that these are insurmountable problems, but I feel like the regulator should focus on issues such as safety, and interoperability should be enforced by the company/consortium that develops the standard in the first place.
I think the thing that would let me stomach it is 1. Only legally mandating it for consumer devices / sold on the open market / commercial entities manufacturing it, or whatever the sensible way to structure that sentiment is, and 2. USB supports negotiating higher power settings, so you're not artificially limiting what's possible, just enforcing a safe worst case. If you need some USB connector that carries a kW you can theoretically still do that, you just have to not default to putting 100+ volts across the power pins the moment the thing is plugged in.
Re: 2: as you likely already know, this is a (major, pretty well thought out) part of newer usb standards. Unfortunately there’s no realistic way to backport this to older usb accessories. You can’t push an OTA update to the author’s wall wart.
I thought it worked fine with devices that had never been updated because you start from the very oldest spec and the lowest power and then both devices have to negotiate up?
I mean, sure you won't get better performance or power out of an old device but it's at least safe which is good enough for me, and for bonus points it's likely to work at some reduced level.
Yup, that's how it works, but you unfortunately can't retrofit out-of-spec devices like the author's to be within spec at scale.
That CE marking on the adapter is probably fake - how would it have passed certification while outputting a voltage 50% higher than spec?
All the CE marking indicates is that the product complies with European specifications. As far as I know, there are no EU/EEA specifications regarding the voltage of a USB port.
This may violate the USB spec and burn out phones, but it's not necessarily undeserving of the CE mark.
Turns out the new regulation for chargers and USB-C universal compatibility actually demands USB Power Delivery [1] support, but I guess it's too early to expect compliance with that.
Even without that, one would hope that part of the compliance tests includes actual compliance with whatever port specification is being used - I don't think any part of the USB spec allows for 7.5V, but I might be wrong. You just wouldn't expect, say, a portable battery to have a standard Schuko plug but output 330V while bearing the CE mark.
[1] https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CEL...
I quote from the regulation you linked:
Unless you can convince your local standards authority that the cat water fountain you bought is actually a weird-looking portable speaker, I don't think this law applies here.
Unfortunate, really, because a USB port really shouldn't use non-standard voltages. I'm sure there's some consumer protection law you could use in court to make the manufacturer pay if the adapter damaged your devices (there are a bunch of "a product should work like you would expect" types of laws sprinkled across consumer protection law), but I don't think this violates CE standards.
Why not have efficient law and just sue large sums out of misbehaving manufacturers? Then you no longer punish the innocent.
If you can get a hold of the manufacturer, sure. But don’t count on Amazon or other marketplaces helping you with that: They’re not a reseller, as they’ll be happy to tell you; no liability! (And no responsibility for collecting customs payments either.)
Make Amazon sign up for compulsory customer protection insurance if they want to share the liability with somebody, otherwise it's all theirs.
OTOH we have tons of crazy non-USB transformers with all kinds of different plugs that might or might not plug into the devices they're meant for.
It seems like somehow we're used to the idea that this particular plug will only work with the device that came with it, but we assume that USB-A is USB-A. I know I do.
This does make me wonder about the USB adapter I got with my dash cam. The documentation says all over the place to pretty please with sugar on top use our adapter. I'll have to stick a meter on it. [Edit: 5.1v.]
It's not just USB-A. You can assume your US-style wall outlet is ~115 V, 60 Hz. Cigarette lighters in cars are 12-ish V.
It's likely that they are trying to avoid support calls from users plugging into whatever USB port is on their dash, which might not supply enough current, or which is intended for media/AA/CarPlay and would force the camera into interfacing with the infotainment system.
USB does control use of its marks; it would probably take new law to prevent use of the physical arrangement.
Even if the connector itself were patented?
Mandatory how? Won't they just ship an uncertified adapter anyway?
Patent the connector and enforce it aggressively.
Yes, mandating this on consumer devices is quite sensible.
At least adhere to the spec to such a level that it won't destroy other hardware.
As long as you are free to modify the hardware or purchase specific non-conforming hardware I see no reason why this would be painful for a hacker.
For pure consumer products (whatever that means) safety standards are quite reasonable. There are already legal requirements in the USA that plugs have certain shapes controlling orientation, grounding, and expected voltage, as well as building code so that people don’t trip over long power cables. Food safety means you can pick up something at the shop and have some faith it won’t kill you.
This reduces cognitive overhead and doesn’t require you to be an expert in every domain.
It also doesn’t stop you from hacking your own stuff (mostly — please don’t mix ammonia and bleach!) and easily provides you a bunch of dependable items on which you can build your hacks. So you can (briefly!) supply 100 amps of 240V AC over USB A at home if you want, you just can’t sell it. You can make yoghurt at home and store it on the counter in the sun if you want, no problem.
I'd rather see specific laws against bad voltages on USB.
Isn't there already some law somewhere against things that are so obviously not what a consumer would expect?
It’s difficult enough to figure out which side goes up. I can’t worry about this as well.
I don't see how making sure that something which looks and quacks like a duck actually is a duck conflicts with the hacker ethos. The more standardized things are, the easier it is to build something. I love that I can take any combination of DIN parts and suppliers and know they will play nicely. Also, there are a lot of standardized higher-voltage connectors. This case is incompetence or negligence from the manufacturer - and those are against everything hacking is about.
Could that ever really help? The vast majority of devices do this correctly, even from the cheap, nameless manufacturers cranking stuff out for sale in marginal retail outlets. I suspect even if there was some sort of legislation or accountability that this particular culprit would have pulled the same stunt. They'd have relied on the fact that it's difficult to track down who manufactured the item, that there'd be little incentive to do so by anyone who became aware of it, and that they'd be safe from repercussions across some international border.
I think your initial instinct is right.
Why? I'm not sure what being a hacker has to do with it, but this sort of thing would benefit everybody and harm nobody except for the negligent.