As we all know, AC won the “War of the Currents”. The main reason is that AC voltage is easy to step up and down with just a ring of iron and two coils, and high voltage lets us transmit power over long distances with much less loss.
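To put rough numbers on that, here’s a quick back-of-envelope sketch (the power and resistance figures are just illustrative values I picked, not real line data):

```python
# Why higher transmission voltage means less resistive loss:
# for a fixed delivered power P, the line current is I = P / V,
# so the line loss P_loss = I^2 * R shrinks with the square of V.

P = 100e6   # 100 MW to deliver (illustrative)
R = 0.5     # assumed line resistance in ohms (illustrative)

for V in (10e3, 110e3, 220e3, 400e3):
    I = P / V               # current needed at this voltage
    loss = I**2 * R         # resistive loss in the line
    print(f"{V/1e3:>4.0f} kV: I = {I:8,.0f} A, "
          f"loss = {loss/1e6:7.3f} MW ({100*loss/P:5.2f} %)")
```

Same wire, same delivered power, and the loss collapses just by raising the voltage.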

Now, the War of the Currents happened around the 1890s, and our technology has improved a lot since then. We have useful diodes and transistors now, we have microcontrollers and buck/boost converters. We can convert DC voltages efficiently today.

Additionally, photovoltaics naturally produces DC. Whereas a traditional generator has an easier time producing AC, photovoltaic plants have to convert their power to AC, which, if I understand correctly, comes with a significant loss.

And then there’s the issue of stabilizing the frequency. When you have one big producer (one big hydroelectric dam or coal power plant), stabilizing the frequency is trivial, because you only have to coordinate with yourself. When you have 100,000 small producers (say everyone in a larger area has photovoltaics on their roof), stabilizing the frequency suddenly becomes much harder, because everybody has to work in exactly the same rhythm.
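To make the “same rhythm” point concrete, here is a deliberately oversimplified toy sketch of droop control, which is the standard way many generators or inverters end up sharing one frequency; all the figures are made up and don’t reflect how any real grid is tuned:

```python
# Toy droop-control model: each unit lowers its output frequency slightly
# as its own loading rises. With identical settings the units share the
# load equally and settle on a single steady-state grid frequency.
# (Secondary control, omitted here, would then nudge it back to 50 Hz.)

f_nominal = 50.0      # Hz
droop = 0.04          # 4 % droop: rated power moves frequency by 2 Hz
p_rated = 5e3         # 5 kW rated output per rooftop unit (made up)
n_units = 100_000     # number of small producers
total_load = 1e8      # 100 MW of demand in the area (made up)

p_per_unit = total_load / n_units                  # 1 kW each
loading = p_per_unit / p_rated                     # per-unit loading
f_grid = f_nominal - droop * f_nominal * loading   # droop equation

print(f"Each unit supplies {p_per_unit/1e3:.1f} kW ({100*loading:.0f} % of rating)")
print(f"Grid settles at {f_grid:.2f} Hz")
```

The hard part isn’t the arithmetic, it’s getting 100,000 independent devices to actually behave like this at the same time, which is exactly the coordination problem I mean.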

I wonder: would it make sense to change our power grid from AC to DC today? I know it would obviously be a lot of work, since every consuming device would have to change what power it accepts from the grid. But in the long run, could it be worth it? Also, what about insular networks? Would it make sense there? Thanks for taking the time to read this, and I’m willing to go into the maths if that’s relevant to the discussion.

  • BearOfaTime@lemm.ee · 3 months ago

    Oh wow, thanks for the detailed writeup. It’s a little above my pay grade (condensers used as localized generators? Wow, what an idea. They must be huge).

    Guess it’s time to find an Intro to Power Grids course from The Teaching Company.

    • gandalf_der_12te@lemmy.blahaj.zone (OP) · 3 months ago

      I’ll give you a short introduction to the power grid (btw, it’s called “Stromnetz” (electricity network) in German). The power grid has many “levels”, where each level is a network of cables that carry current at one specific voltage. For example, you might have a 220 kV level, then a 5 kV level, and a 230 V end-consumer level.

      Between these levels there have to be translations. Today these are transformers, converting high-level AC into lower-level AC or the other way around. For AC networks they are basically a ring of iron and a few coils. For DC networks, however, other converters exist, such as buck/boost converters.
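      As a rough sketch of what such a DC “translation” looks like (ideal, lossless textbook equations only; the 5 kV and 400 V figures are just my example numbers):

      ```python
      # Ideal (lossless) DC-DC conversion: voltage is set by a switching
      # duty cycle D instead of a transformer turns ratio.
      # Buck:  V_out = D * V_in          (step down)
      # Boost: V_out = V_in / (1 - D)    (step up)

      def buck_duty(v_in: float, v_out: float) -> float:
          """Duty cycle needed to step v_in down to v_out (ideal buck)."""
          return v_out / v_in

      def boost_duty(v_in: float, v_out: float) -> float:
          """Duty cycle needed to step v_in up to v_out (ideal boost)."""
          return 1.0 - v_in / v_out

      # Example: stepping a hypothetical 5 kV DC feeder down to 400 V DC,
      # and boosting 400 V back up to 5 kV.
      print(f"Buck  5 kV -> 400 V: D = {buck_duty(5e3, 400):.3f}")
      print(f"Boost 400 V -> 5 kV: D = {boost_duty(400, 5e3):.3f}")
      ```

      (Real converters at grid voltages would of course be multi-stage and far more involved; this is only the idealized relationship.)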

      My question basically is: can anyone give me experimental data on how well DC networks work in practice? Personal experience is enough; it doesn’t have to be super-detailed reports.

      • SomeoneSomewhere@lemmy.nz · 3 months ago (edited)

        I’m not sure there are any DC grids past the tens-of-megawatts range that aren’t just a 2-, 3- or 4-terminal HVDC link.

        Railway DC supplies usually just have fat rectifiers and transformers fed from the AC mains, which also supply the fault current for fault clearing and provide stability.
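        For a rough idea of the relationship those rectifiers give you (ideal 6-pulse bridge, ignoring commutation and load drop; the 1110 V figure is just an illustrative pick, not any particular railway’s design):

        ```python
        import math

        # Ideal average DC output of a three-phase (6-pulse) bridge rectifier:
        #   V_dc = (3 * sqrt(2) / pi) * V_LL  ~= 1.35 * V_LL
        # where V_LL is the line-to-line RMS voltage on the transformer secondary.

        def bridge_dc_voltage(v_ll_rms: float) -> float:
            """Average DC voltage of an ideal 6-pulse bridge fed with v_ll_rms."""
            return 3 * math.sqrt(2) / math.pi * v_ll_rms

        # Example: a secondary of roughly 1110 V line-to-line lands near the
        # common 1500 V DC traction voltage.
        print(f"{bridge_dc_voltage(1110):.0f} V DC")
        ```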

        Ships, or perhaps aircraft, are where I’d expect to start seeing them arrive.

        Almost all land-based standalone DC networks (again, not few-terminal HVDC links) are heavily battery-backed and run at battery voltage; that’s not practical once you leave one property.

        I’m sure there are some pretty detailed reports and simulations, though. A reduction in the cost of multi-kV converters and DC circuit breakers is essential.