Home Made Energy: Renewable Energy For The Rest Of Us

Google Chrome's ad blocking is unfortunately not as good as Firefox's, so occasionally I see ads on the web. I generally ignore them, but I do click on the occasional one either because it's interesting or because I don't like it (since my clicks cost them money!). On The Straight Dope today I came across "Home Made Energy: Renewable Energy For The Rest Of Us". This company sells a guide which purports to tell you how to run your house purely off wind and solar power for less than $200. I'm skeptical.

First of all, electricity costs something like twenty cents a kilowatt-hour, and they're talking about saving some hundreds of dollars a month. So let's say $100 a month - that's 500 kWh a month, or about 700 W. More credible sources cite about $10/watt for solar power, or $7000 for such a system. So is this guide really nonsense? Not necessarily.
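To make that back-of-envelope arithmetic explicit, here's a tiny Python sketch. All the numbers are the rough figures quoted above ($0.20/kWh, $100/month of savings, ~$10 per installed watt of solar), not measured data:

```python
# Back-of-envelope check of the savings claim, using the rough figures
# quoted in the text: $0.20/kWh, $100/month savings, ~$10 per watt of solar.
def average_power_watts(dollars_per_month, dollars_per_kwh):
    """Continuous average power corresponding to a monthly electricity bill."""
    kwh_per_month = dollars_per_month / dollars_per_kwh
    hours_per_month = 30 * 24
    return kwh_per_month / hours_per_month * 1000  # kW -> W

watts = average_power_watts(100, 0.20)
system_cost = watts * 10  # assuming ~$10 per installed watt
print(f"{watts:.0f} W average, roughly ${system_cost:.0f} for solar")
# -> 694 W average, roughly $6944 for solar
```

So a $100/month saving corresponds to roughly 700 W of continuous power, and at conventional prices a system on the order of $7000, which is where the skepticism about the $200 figure comes from.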

Solar cells are expensive to make - think of making microchips the size of a solar panel. That comparison isn't quite fair - solar cells don't need the density of components, but they do need the extremely pure silicon and the high-vacuum manufacturing - but it's a sign that there's a good reason they aren't cheap. A solar system also needs some electronics for converting electricity to a useful voltage, and some way to deal with the fact that the amount of solar power varies in a way that has little to do with the demand for it.

I think a reasonable guide of this sort might be able to point readers at where to scavenge used or discarded parts for all of the above. The power electronics are definitely something a clever amateur could build out of scavenged parts (at some risk to their life!), but I think it would take incredible luck to obtain solar cells that worked and were that cheap. It's also possible that a guidebook could explain how to take advantage of government programs to encourage renewable energy, perhaps obtaining discounts or tax credits on the hardware.

The biggest way governments or energy companies could encourage renewable energy of this sort is to eliminate the need for energy storage. Since most of the people who'd be considering this sort of project already have a connection to the electricity grid, if the utility company is willing, you could simply sell them electricity whenever you make more than you need, and buy electricity when you need more than you make.

Ideally, as a homeowner, you'd get paid the same price for the electricity you sell as you pay for the electricity you buy. Unfortunately, this is often not the case. There are good reasons electric companies would pay less for electricity they get from homeowners than they charge homeowners; for one thing, all those wires to distribute the electricity aren't free. More subtly, it's really difficult to store electricity on the scale that utility companies deal with, so they have to work quite hard to make sure that the amount of electricity fed into the grid in any given second exactly matches the electricity drawn out of it in that second. Having countless small generators outside their control is going to make that job much more difficult.

That said, persuading companies to act in a way that costs them money but benefits all people is a natural role of government. Paper mills have waste treatment systems not out of the goodness of their nonexistent hearts but because the government charges them massive fines or shuts them down if their effluent is too toxic. So if the government were to force (or fund) companies to pay consumers the same price for electricity they generate as they charge for electricity they use, suddenly a lot more small-scale power generation projects would become cost-effective.

Incidentally, another approach to storing solar power for when you need it is to let it charge your electric car (or plug-in hybrid). This has even been proposed as a scheme to help with load-levelling in the power grid.

Anyway, the upshot of all this is that I think that yes, it is occasionally possible to scrounge together a cheap renewable energy system. But I suspect that the claimed $200 is only achievable in the best possible case - scavenged parts, government subsidies, living in a sunny desert, having a cooperative utility company, and incredible luck.

Full post

Artificial gravity

Science fiction is full of spaceships zipping around the galaxy, and almost all of them seem to have some kind of artificial gravity on board. For TV shows and movies, this is obviously a practical necessity, and even for written science fiction, freefall is such an alien condition that it would be a real challenge to write realistically about it. So science-fictional spaceships generally have some sort of artificial gravity. But will real spaceships?

Current spacecraft certainly don't have any kind of artificial gravity. Early craft, like Mercury or Apollo, were so cramped that I think it must have been a blessing to be able to use every available cubic centimeter. Soyuz, still in use, is not much bigger, and the Space Shuttle is mighty cramped too. In any case, when people are spending only a few days at a time in an environment, they can put up with a great deal. But in the longer term, it does appear that freefall causes some health issues: even with two hours of exercise a day, astronauts seem to suffer from bone demineralization, muscle loss, and cardiovascular problems, and there also seem to be some peculiar immune system effects (though apparently cockroaches adapt just fine). For the International Space Station, astronauts exercise and don't stay up too long. But for something like a mission to Mars, it would certainly be nice to provide some sort of artificial gravity.

Shows like Star Trek and Battlestar Galactica posit some kind of "gravity generator", but this is pretty much the same technology as antigravity (and maybe reactionless drives). This basically requires wild departures from the laws of physics as we know them, so I'll leave them and other "magic" systems aside.

We do know one way to produce something very like gravity: rotation. If you're in a wheel that's spinning, centrifugal force feels very like gravity, pushing you outwards against the wall. There is the Coriolis force, which gives moving objects a push at right angles to their direction of motion; it turns out that if you spin humans at more than about 10 revolutions per minute and they try to move around, the Coriolis force causes severe disorientation and nausea. But with a wheel about 20 m in diameter you can get a full Earth gravity by rotating at that top speed. (Incidentally, that 10 RPM requires slow and careful acclimatization, so it would be preferable to limit the spin to 3 RPM or less, to which most people can become acclimatized; since the required radius scales as the inverse square of the spin rate, that pushes the needed diameter up roughly elevenfold, to about 200 m.)
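These sizes fall straight out of the formula for centripetal acceleration, a = ω²r. A small Python sketch (assuming the standard 9.81 m/s² for one Earth gravity):

```python
import math

G = 9.81  # m/s^2, one Earth gravity

def diameter_for_gravity(rpm, g=G):
    """Wheel diameter (m) whose rim feels centripetal acceleration g
    when spinning at the given rate. Uses a = omega^2 * r."""
    omega = rpm * 2 * math.pi / 60  # revolutions/min -> rad/s
    radius = g / omega**2
    return 2 * radius

print(f"{diameter_for_gravity(10):.0f} m at 10 RPM")  # -> 18 m
print(f"{diameter_for_gravity(3):.0f} m at 3 RPM")    # -> 199 m
```

Because the radius goes as 1/ω², slowing the spin from 10 RPM to 3 RPM multiplies the required diameter by about eleven, not three.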

Science fiction contains a number of examples of spaceships with rotating sections. This doesn't violate any laws of physics, but it seems to me to present some rather serious engineering difficulties. The first is, how do you connect the rotating section to the non-rotating section? I can imagine some rolling ball-bearing joint, though the vacuum of space does tend to make things stick together, and rolling joints generally require constant lubrication (and frequent maintenance), which is going to be hard to do in a vacuum. There's also the issue that, given the size of the moving parts, if there's any kind of problem with the joint, the ship will probably tear itself apart.

If you want people to be able to easily move back and forth between the sections, you'll need to pressurize the whole thing, which means that you need this rolling joint to also be airtight. Techniques for making rolling seals range from the simple to the exotic (stuffing boxes, labyrinth seals, ferrofluid seals) but they're all tricky, and for the kind of long-term operation that motivates artificial gravity, you would need exceedingly low leakage and very high reliability. You could avoid this, and vacuum joint issues, by having a spinning wheel inside a non-spinning airtight shell, but mass will always be at a premium, and remember the wheel has to be quite large.

More serious as a problem, it seems to me, is the issue of cable wrapping. Think of it this way: how do you connect the cables and hoses - power, communications, air, water - from the rotating segment to the stationary segment? If you just connect them directly, they will immediately get twisted into a bundle and then break (radio telescopes solve this problem by having only a limited range of rotation - 720 degrees for Arecibo, for example - but this is obviously no use here). In principle you could do something with a ring on one part of the ship and a brush that slides around it on the other, but remember you have to have a separate ring for every connection you want to make, and this sort of sliding connection is one of the trickiest parts of an electric motor to build. If you were feeling particularly devious you could transmit power to the rolling part of the ship by using a generator to draw power from the rotation itself, and if you had to you could avoid other electrical connections by transmitting all your data (control, telemetry, navigation, et cetera) wirelessly from one part of the ship to the other. Water hoses are going to be a problem any way you cut it.

I think my preferred solution is to roll the whole ship. This does make it a pain to do things like keep a telescope fixed on one point, or keep your communications dish pointed at the Earth, but for a ship that moves around, all your exterior sensors need to be steerable anyway, so this doesn't seem like it's necessarily a problem.

Whether you roll the whole ship or just have a rotating section, the angular momentum bound up in the rolling section will make maneuvering the ship a nightmare. Not impossible, especially under computer control, but expensive in terms of fuel, liable to cause tumbling, and just generally a bad idea. So stopping the rotation when you need to maneuver seems sensible; maneuvers will probably be rare and planned well in advance. This does mean you need a not-too-expensive way to start and stop the rotation. A pair of counterrotating sections, or a flywheel, would let you do it without using up any reaction mass, just energy, but there's a very great deal of angular momentum to store, so it may be easier to simply use maneuvering jets.

For a space station, many of the same issues apply; rolling the whole station still seems like the simplest and most reliable approach. The cost of starting and stopping isn't very important, since presumably the station will be spun up once built and keep spinning indefinitely. Docking with such a station might be a challenge, though. Docking at the rim requires spacecraft to essentially "hover" under a full gravity of thrust before they can latch on. Docking at or near the hub could be done by just matching the ship's roll to the station's. Unloading would then have to take place in microgravity (though with the Coriolis force). Ships, once docked, would presumably be moved to berths off the station's axis to make room for more landings. Departures should probably be along the axis as well for the sake of station stability, although in principle a ship could just "drop" off the station rim at the right moment and steal a nice initial kick from the station's rotation. Whether or not ships do this, the station will need to be able to shift substantial amounts of mass around its rim to keep itself balanced; large movements of mass aboard the station will need to be arranged ahead of time with station control.
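To put a number on both the "hover" requirement and the free kick: the rim of a full-gravity station moves surprisingly fast. A quick Python sketch, assuming a hypothetical 200 m station at 3 RPM (which gives about 1 g at the rim, since a = ω²r):

```python
import math

def rim_speed(rpm, diameter_m):
    """Tangential speed (m/s) at the rim of a station spinning at the given rate."""
    omega = rpm * 2 * math.pi / 60  # rad/s
    return omega * diameter_m / 2

# Hypothetical full-gravity station: 200 m across, spinning at 3 RPM.
print(f"{rim_speed(3, 200):.0f} m/s")  # -> 31 m/s
```

So a ship dropping off the rim departs at around 31 m/s (over 100 km/h), and a ship docking there has to match that tangential velocity while hovering, which is why hub docking looks much more attractive.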

In summary, artificial gravity is possible and probably desirable for long-term stays in space, but it won't be simple.

Full post


I have been doing some X-ray astronomy. In optical astronomy, spectroscopy is a very powerful tool: by looking for emission and absorption lines you can identify the elements present in a gas (helium was discovered this way, for example); the shapes of the lines can tell you about temperatures and velocities in the object, and the shape of the broadband spectrum can also tell you about the temperature and conditions in the emission region. In X-rays, things are more difficult, for a number of reasons. Unfortunately, lines are much rarer (at least when looking at neutron stars), telescope time is very scarce (since the telescopes must be in space), and there's always a shortage of photons. But X-ray spectroscopy still has the potential to tell you about temperatures, sizes, and compositions of neutron stars (for example). So that's what I've been working on.

The standard tool for X-ray spectroscopy is xspec, one of those pieces of scientific software that's had a great deal of cleverness built into it, very little of which has gone into making it easy to use. It could be worse - at least its interface is not stuck in the FORTRAN era, in fact it has a tcl interpreter built in (yuck) - but its plotting in particular is pretty rudimentary, tending to produce monstrosities like this:

The worst part of this graph, apart from the fact that it's practically unreadable even for those with normal color vision, is that it's quite deceptive. It looks as if there's clear evidence for a bend in the spectrum just below 2 keV. But look at the plot below the jump for comparison.

The data points on this plot are identical, but drawing a single power-law through the whole thing makes it look like the data's completely straight. Ordinarily one would choose between the models based on the statistics, but we have so few photons (about 8000) that both models are perfectly adequate fits to the data. I suppose Occam's Razor tells me I should pick the simpler model, though whether this should be the simple but not particularly physical power-law or the physically plausible but more complicated power-law plus neutron-star atmosphere model isn't entirely clear to me.
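To illustrate why so few photons can't discriminate between the models, here's a toy calculation in Python. The spectrum, break energy, and indices are invented for illustration (this is not my actual data): a power law that steepens above 2 keV differs from a straight one by only a couple of units of chi-squared once the error bars are Poisson-sized.

```python
import numpy as np

# Toy sparse X-ray spectrum (invented numbers, not real data).
energy = np.geomspace(0.5, 10, 20)   # keV bin centers
straight = 100 * energy**-1.5        # expected counts, pure power law
# The same spectrum, but steepening from index 1.5 to 1.7 above a 2 keV
# break (normalized so the two models agree exactly at 2 keV):
bent = np.where(energy < 2, straight, 100 * 2**0.2 * energy**-1.7)
sigma = np.sqrt(straight)            # Poisson error bars

# Chi-squared of the bent model against data drawn exactly from the
# straight one: even a perfectly straight dataset barely penalizes the bend.
chisq = np.sum(((straight - bent) / sigma)**2)
print(f"chi-squared difference: {chisq:.1f} over {len(energy)} bins")
```

With these made-up numbers the two models differ by only about two units of chi-squared across twenty bins, nowhere near significant, which is the same trap the two plots above set visually.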

I'll keep thinking about how to improve the graphics, but the problem is I have five data sets in which each data point has its own vertical and horizontal error bars, and the model gives slightly different predictions for each data set (since they use different instruments with slightly different responses). The plotting tools provided by xspec are also not very flexible (and I haven't found a good way to export the relevant data so I can use my own plotting tools).

What really bothered me, though, was how strongly each graph suggests its own interpretation. It would have been easy to look at one and assume that it told the whole story.

Full post


I recently watched Carl Sagan's Cosmos (available online by request of its co-creator, Sagan's widow Ann Druyan). It's a very effective piece of science popularization, and I'm sorry I passed up a chance to be introduced to Carl Sagan, but one aspect that stands out is how he reiterates the theme of "if we do not destroy ourselves". It sounds a bit odd now: while climate change and other coming global ecological crises are alarming, somehow they don't leave me feeling like we are risking the extinction of the human race. Cosmos, though, was made in 1980, when two great superpowers were threatening exactly that. The danger of global nuclear war doesn't feel so immediate now, but I think it is still very real, and so I am glad when I hear President Obama talking about nuclear disarmament.

In any case, the most fraught years of the Cold War motivated a number of people - including Carl Sagan - to make movies describing the likely outcome of a global thermonuclear war. Several of these powerful, albeit harrowing, movies are available online in their entirety.

When the Wind Blows. The British government issued a series of short films and brochures on how to respond to a nuclear attack. The makers of this film imagine an ordinary retired couple who attempt to follow these instructions. It will come as no surprise that taking their doors off their hinges and building a nest of blankets don't do them much good; the film follows them right to the bitter end.

The Day After. Set (and filmed) in Lawrence, Kansas, this film starts with some ordinary Americans living normal small-town lives, and simply supposes that a cold-war dispute over Berlin escalates until the two sides do what they've been threatening to. Small-town life has left the main characters better prepared to survive the immediate consequences of the devastation, but in the weeks after the exchange we see the effect of thousands of sick, dying, and desperate people converging on a hospital that could never have handled them all even with power, supplies, and healthy staff.

Threads. Perhaps the most harrowing of the three, this film is set in Sheffield, and follows the main characters - those who survive - past the initial deaths and civil disorder into the years that follow, in which what's left of the government attempts to keep some survivors healthy enough to eke out what crops they can in spite of nuclear winter. The name comes from the idea that the society we know is held together by a network of "threads" of personal connection, and that such a holocaust shreds the fabric, leaving no society we would recognize. Britain cannot even return - once enough people have died - to its state as of the Middle Ages, since its forests are no longer available as fuel and (even leaving aside contamination and nuclear winter) much of its soil may no longer be suitable for cultivation without fertilizers. The social effects of the brutal measures necessary for survival - both by the government and by people trying to survive - well, I will not attempt to capture the grim picture the film paints, but it is wholly believable.

Incidentally, the consequences of nuclear war depicted in Threads are based on what British civil defense planners were told to expect by the Americans. Apparently one of them let slip that they had been warned that the bombs that would have so devastated Sheffield would have been launched by Americans hoping to deny the UK to Soviet forces.

Lest you think that these films exaggerate the horror of a nuclear attack, you can watch or read Barefoot Gen, about some children who survive the atomic bombing of Hiroshima. The message of this movie is ultimately one of hope, unlike the previous three, but the images of the attack itself, based on the author's experience as a Hiroshima survivor, are far more horrific than any shown in the previous three movies.

My point is this: we, scientists and engineers, soldiers and workers and politicians, sweated for forty years to arrange this fate for ourselves. Almost all those missiles still exist, and are still pointed at the same victims now. I hope the political situation has changed to make it unlikely that they will be used (though I note that Threads begins with a conflict in Iran). But we shouldn't forget about the destruction we worked so hard on, and we should think about what it says about us that we planned this.

Full post

ssh control socket: almost great

ssh is an essential tool on a unix network. I use it to log in to machines remotely, control VNC desktops, act as a VPN (SOCKS proxy), synchronize source code (with git and svn), serve music and movies across a wireless network (with sshfs), and transfer hundreds of gigabytes of pulsar data (with rsync). So the ControlMaster feature seemed like a great idea: automatically reuse one ssh connection for as many logins and file transfers as necessary. But it won't quite do what I want.

I see two major failings:

First of all, the connection dies when the initial ssh process dies. If you log in, creating a master socket, and do something, then log in again as a subsidiary connection, you get your second connection through the ControlMaster magic. If you then log out of the first connection, its ssh process keeps running to serve the subsidiaries; but if you kill it (say by closing the window it's in), all the subsidiary connections die. What this means is that if you try to use opportunistic connection reuse, and you have several terminal windows open on the same host, for the most part you can just close any one window. But there's one window that, if you close it, will take down all the others with it. Yuck.

You can kind of work around this by, instead of using opportunistic connection sharing, explicitly starting a master connection with "ssh -MfN host", which drops the ssh process into the background as soon as it's connected. Unfortunately, this means you have a quasi-zombie ssh process hanging around indefinitely. So I'm not sold on it either. (But if you're going to do it, using autossh might help.)

The second, more serious, problem I have with ControlMaster is that it doesn't let subsidiary ssh connections open new port forwardings. I use port forwardings a lot, for example to forward VNC connections to machines I can't see from the outside world. If opportunistic connection sharing causes those to fail, or worse, fail sometimes, it's going to be a problem. A shame really, it's such a sensible idea.

Edit as of 2013 August 29: OpenSSH now has the ControlPersist option, which, in combination with ControlMaster and ControlPath, can be used to make the controlling SSH process background itself. You still can't (as far as I can tell) add new forwardings later on, but at least the problem of having the first connection be magical has gone away. I use this combination for one particular machine that accepts only password authentication (don't get me started) but that I use only as a gateway machine. Now the first time I try to connect through it I get prompted for a password but later connections just reuse the link. And because all it's doing is forward connections using the -W option, I don't care that I can't add port forwardings. 
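For concreteness, here's a sketch of the kind of ~/.ssh/config stanza that sets this up (the host name and socket path are placeholders, not my actual setup):

```
Host gateway
    HostName gateway.example.com
    ControlMaster auto
    ControlPath ~/.ssh/ctl-%r@%h:%p
    ControlPersist 10m
```

With this, the first `ssh gateway` prompts for a password and leaves a master process in the background; later connections through the same ControlPath socket reuse the link, and the master lingers for ten minutes after the last connection closes rather than tying itself to any one terminal window.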

Full post