2008/3/3-6 [Science/GlobalWarming] UID:49317 Activity:moderate
3/3     Photos of weather stations...badly placed weather stations.
        http://www.norcalblogs.com/watts/weather_stations
        \_ Oh, so global warming is just caused by concrete? I'm so relieved!
           All those dumb scientists ought to read angry guy's blog! FYI, you
           may wish to read realclimate's "no man is an urban heat island"
           entry:
           http://www.realclimate.org/index.php/archives/2007/07/no-man-is-an-urban-heat-island
           http://tinyurl.com/ys6hg6 (realclimate.org)
           \_ You global warming evangelists are scary. Everything is global
              warming this, global warming that. These are pictures of *bad
              science*, and your kneejerk response is like a young-earth
              creationist's immediate response to dinosaur bones.
              \_ They're pictures of bad data points, not bad science.
                 \_ Collecting data is part of science, or didn't you do
                    any labs?
                    \_ This is an amazing comment. I would go so far as to
                       say getting good data is the hard part of empirical
                       science. -- ilyas
                 \_ ...which is why the climate models use thousands of data
                    points from ground, atmospheric, and oceanic stations,
                    and do what they can to correct for errors.
                    \_ And if a substantial portion of the ground stations
                       have problems....?
                       \_ Will nobody read the damn realclimate article?
                          \_ I don't think any of the "mistaken assumptions"
                             apply. And?
              \_ Agreed, case in point, the free market is already solving
                 global warming. Look at Planktos Incorporated. They trap
                 CO2 by dumping iron dust into the ocean.
                 \_ No one could have anticipated that dumping iron dust
                    into the ocean could have caused [even bigger problem]
              \_ My response was to the pompous douchebag that runs that
                 blog, not you in particular. You should check out the
                 realclimate post on UHIs, though. You know, follow the
                 link, read the information within, just like I did with
                 that sad little picture site.
www.norcalblogs.com/watts/weather_stations -> www.norcalblogs.com/watts/weather_stations/

October 01, 2007 - How not to measure temperature, part 32

A common theme with official climate stations of record is their placement at city and county fire stations. An observer is needed to transfer the data from the thermometer to the B91 form sent to NCDC every month. Unfortunately, fire stations are often not good places to measure temperature due to the amount of concrete and equipment around them. Their placement is chosen to better serve the city population, putting them in the middle of the UHI. While this placement "is" over grass, the narrow grass strip is also within feet of parking and a major thoroughfare, located downtown near high-rise buildings. This is not an ideal place at all to measure temperature, yet it is the official USHCN climate station of record. As shown by the graph provided by NASA GISS, there appears to be a step bias introduced in the early 1980's. It is not uncommon to see such step biases introduced when the MMTS replaces the traditional Stevenson Screen shelter and mercury thermometers, since cable issues often force the MMTS to be closer to buildings. Over 70% of our USHCN climate network uses the MMTS electronic thermometer system.

While fire stations do give the appearance of a regular warm body to record the temperature and send it in to NCDC, it doesn't always go as planned. For example, this B91 form from Bartow looks a bit like a workbook assignment from school. B91 form provided by the Marysville observer (PDF format). Please don't get the idea that I'm putting down the hard work of the amateur climate observers; they perform a valuable and much needed service. The point here is quality control issues and missing data. These B91 forms are just a couple of random samples; more on that missing data issue soon.

September 20, 2007 - How not to measure temperature, part 31

It's been a while since I updated this series, and it's not for lack of material. But I got busy with the UCAR conference, publishing a slide show, and other things. Volunteer Don Kostuch sent in his latest survey, from Titusville, FL, near Cape Canaveral and KSC. I'd like to point out that Don has traveled further and surveyed more stations in the USA than anyone. He wrote this in his email to me: "On your scale of 1 to 5, this is an 8. Peace, Don Kostuch" OK, in the past we have seen stations on rooftops, at sewage treatment plants, over concrete, next to air conditioners, next to diesel generators, with nearby parking, excessive nighttime humidity, and at non-standard observing heights. OK, here is your chance: show me the equations to untangle Titusville's temperature record from microsite bias.

The online siting specifications are more formal and largely based on WMO report 488, which contains some interesting quotes that are not present in later reports. The online reports also refer to the report below, which unfortunately I was not able to locate either online or in our library. World Meteorological Organization, 1993a: Siting and Exposure of Meteorological Instruments (J. These specs are worth a read, because they show that quite a lot of thought and analysis went into choosing the specs.

The picture illustrates how human activity can spring up around a station. The MMTS electronic temperature sensor is shown next to a lean-to used for rafting gear storage. I presume the life preserver is placed next to the sensor as a reminder that we may need it in case of catastrophic sea level rise.
The metal ore cart full of stones is a nice touch, and makes a perfect high-mass IR radiative heat source to keep the overnight lows a bit more "comfy". Once again, we have a climate station of record in the middle of a parking area, near buildings, and directly in the middle of regular human activity.

One of the downsides to the NWS COOP modernization program, started in the 1980's and continuing today, is the MMTS unit itself. It requires a cable, and that cable has to be buried to be brought into the domicile containing the electronic readout. As anyone knows, especially rabbits, digging short holes is far easier than digging long ones. So it's far easier and less time consuming to dig a short trench and place the sensor nearer the building. This proximity bias seems to have been repeated regularly when the MMTS system has replaced the traditional Stevenson Screen and mercury max-min thermometers. There's a reason that NOAA specifies that temperature sensors should be a minimum of 100 feet away from buildings, concrete, and asphalt, which may introduce biases into the reading. What we don't know is why there has been such an apparently regular failure to adhere to such specifications.

August 04, 2007 - How not to measure temperature, part 27 - Basketball anyone?

This is the climatological station of record for Odessa, Washington. It is at the residence of a COOP weather observer administered by NOAA. In addition to the proximity to the house and the asphalt being less than the 100-foot published NOAA standard, we have a basketball goal nearby. I don't know if any studies or standards exist that describe what effects, if any, having the MMTS sensor whacked by errant basketballs might have. Speaking from my own electronic design experience, though, transient and numerous G forces applied to electronic sensors don't generally allow for sustained accuracy and reliability.

August 01, 2007 - How not to measure temperature, part 26 - counting A/C units

There's been some recent discussion about how only rural stations have been used in the NASA GISS analysis, and those rural stations are qualified by looking at nighttime DOD satellite photos and doing a count of visible streetlights within a radius to quantify UHI potential or lack thereof. Happy Camp, California, population 2,182, is an old gold mining and logging town located in the rugged NW corner of the state, about 100+ miles from any major city. The NOAA MMS metadata website reports data back to 1931 with 3 small-distance station moves and no changes to equipment. It looked like a good candidate for a lights=0 survey. The weather station is located at the Ranger Station: Happy Camp Ranger Station - USHCN climate station of record. But what you can get from satellite images and databases can't really prepare you for what you may find. I "expected" to find an old classic Stevenson Screen, probably near the Ranger Station office. But what I didn't expect to find was a "rural" station swimming in a sea of exhaust from 22 air conditioning units within 100 feet of the Stevenson Screen.

In addition to the 22 A/C units within 100 feet there are other biases too. Granted, not all 22 may be introducing a bias, but since NASA's Dr. James Hansen counts lights near stations to assess UHI magnitude, we can count A/C's. If each A/C unit were 2000 BTU, that would be 22x2000=44,000 BTU of waste heat dumped within 100 feet of the Stevenson Screen where the thermometer is located.
As for other biases, positive and negative, there are the buildings, the windows, the shade trees, the wind sheltering, and the lawn sprinkler. There's also the big parking lot to the southwest, and the Stevenson Screen is at the top of a slope with a parking lot downslope. When I mentioned the A/C units to the site curator she said "hmm, I never thought about that," but then added, "But I can tell you that when we water the lawn, my high temps are lower." I asked the curator what the prevailing wind direction was, and she said from the "south to southwest usually."

[Figure: unedited NASA GISS raw data plot for Happy Camp RS]

So one has to wonder, with all the observed microsite biases, what are the data really showing? One also wonders what the plot might look like if this station were better sited. And if a lights=0 station like this one, far removed from urbanization, has so many such micro-site biases, could others have similar problems? It looks like more hands-on site surveys will have to be done to determine the true value of lights=0 USHCN sites.

From surfacestations.org volunteer Don Kostuch: the Detroit Lakes, MN USHCN climate station of record. The Stevenson Screen is sinking into the swamp and the MMTS sensor is kept at a comfortable temperatur...
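The step bias described in part 32 above (the jump in the NASA GISS plot around the time of the MMTS changeover) is, in principle, something one can look for directly in a station's record. Below is a minimal sketch, purely illustrative and not taken from the blog or from any NOAA homogenisation code: it scans a monthly series for the single breakpoint that best separates two different means, using a pooled two-sample t statistic. The synthetic series and the one-year minimum segment length are assumptions.

```python
import numpy as np

def best_step_change(temps):
    """Scan a 1-D temperature series for the single breakpoint that best
    splits it into two segments with different means.
    Returns (index, mean_shift, t_statistic). Illustrative only."""
    temps = np.asarray(temps, dtype=float)
    n = len(temps)
    best = (None, 0.0, 0.0)
    for k in range(12, n - 12):          # require at least a year on each side
        a, b = temps[:k], temps[k:]
        shift = b.mean() - a.mean()
        # pooled-variance t statistic for the difference in means
        s2 = ((a.var(ddof=1) * (len(a) - 1) + b.var(ddof=1) * (len(b) - 1))
              / (len(a) + len(b) - 2))
        t = shift / np.sqrt(s2 * (1 / len(a) + 1 / len(b)))
        if abs(t) > abs(best[2]):
            best = (k, shift, t)
    return best

# Hypothetical example: 40 years of monthly anomalies with a +0.5 C step
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.3, 480)
series[240:] += 0.5
idx, shift, t = best_step_change(series)
print(f"best breakpoint at month {idx}, shift {shift:+.2f} C (t = {t:.1f})")
```

Operational homogenisation is considerably more involved (it compares each station against neighbouring stations rather than testing one series in isolation), but the basic idea of locating a discrete shift is the same.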
www.realclimate.org/index.php/archives/2007/07/no-man-is-an-urban-heat-island

There has recently been an assault upon the meteorological station data that underpin some conclusions about recent warming trends. Curiously enough, it comes just as the IPCC AR4 report declared that the recent warming trends are "unequivocal", and when even Richard Lindzen has accepted that the globe has in fact warmed over the last century. The focus of attention is the placement of the temperature sensors and other potential 'micro-site' effects that might influence the readings. There is a possibility that these effects may change over time, putting artifacts or jumps into the record. This is slightly different from the more often discussed 'Urban Heat Island' effect, which is a function of the wider area (and so could be present even in a perfectly set up urban station). UHI effects will generally lead to long term trends in an affected station (relative to a rural counterpart), whereas micro-site changes could lead to jumps in the record (of any sign) - some of which can be very difficult to detect in the data after the fact.

There is nothing wrong with increasing the meta-data for observing stations (unless it leads to harassment of volunteers). However, in the newfound enthusiasm for digital photography, many of the participants in this effort seem to have leaped to some very dubious conclusions that appear to be rooted in fundamental misunderstandings of the state of the science. Let's examine some of those apparent assumptions.

Mistaken Assumption No. ...

UHI effects have been documented in city environments worldwide and show that as cities become increasingly urbanised, increasing energy use, reductions in surface water (and evaporation) and increased concrete etc. tend to lead to warmer conditions than in nearby more rural areas. GISTEMP uses satellite-derived night light observations to classify stations as rural and urban and corrects the urban stations so that they match the trends from the rural stations before gridding the data. Other techniques (such as correcting for population growth) have also been used. How much UHI contamination remains in the global mean temperatures has been tested in papers such as Parker (2005, 2006), which found there was no effective difference in global trends if one segregates the data between windy and calm days. This makes sense because UHI effects are stronger on calm days (where there is less mixing with the wider environment), and so if an increasing UHI effect were changing the trend, one would expect stronger trends on calm days, and that is not seen. Another convincing argument is that the regional trends seen simply do not resemble patterns of urbanisation, with the largest trends in the sparsely populated higher latitudes.

Since scientists started thinking about climate trends, concerns have been raised about the continuity of records - whether they are met. ... The danger of mistakenly interpreting jumps due to measurement discontinuities as climate trends is well known. Some jumps can be corrected for (as in the new version of the NOAA product); others can be adjusted using known information (such as biases introduced by changes in the time of observation or by moving a station). However, there are undoubtedly undetected jumps remaining in the records, but without the meta-data or an overlap with a nearby unaffected station to compare to, these changes are unlikely to be fixable.
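To make the logic of the windy/calm comparison above concrete, here is a minimal sketch of the kind of test described; it is not Parker's actual code or data. It splits a station's daily record by a wind-speed threshold and fits a linear trend to each subset. The column names, the 4 m/s threshold, and the pandas DataFrame layout are assumptions for illustration.

```python
import numpy as np
import pandas as pd

def trend_per_decade(dates, temps):
    """Ordinary least-squares trend, in degrees C per decade."""
    years = dates.dt.year + dates.dt.dayofyear / 365.25
    slope = np.polyfit(years, temps, 1)[0]
    return slope * 10.0

def windy_vs_calm_trends(df, wind_threshold=4.0):
    """df is assumed to have columns 'date' (datetime64), 'tmean' (deg C)
    and 'wind' (m/s); the calm/windy threshold is an arbitrary choice."""
    calm = df[df['wind'] < wind_threshold]
    windy = df[df['wind'] >= wind_threshold]
    return {
        'calm_C_per_decade': trend_per_decade(calm['date'], calm['tmean']),
        'windy_C_per_decade': trend_per_decade(windy['date'], windy['tmean']),
    }

# If a growing urban heat island were inflating the record, the calm-day
# trend would be expected to come out noticeably larger than the windy-day
# trend; the article's point is that no such difference is found.
```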
Comparisons can also be made against the climate reference network, which is much more closely monitored than the volunteer network, to see whether the large scale changes from this network and from the other stations match. Any mismatch will indicate the likely magnitude of differences due to undetected changes. It's worth noting that these kinds of comparisons work because of the large distances over which the monthly temperature anomalies correlate. That is to say, if a station in Tennessee has a particularly warm or cool month, it is likely that temperatures in New Jersey, say, also show a similar anomaly. See "The Elusive Absolute Surface Temperature" to understand why we care about the anomalies rather than the absolute values.

The main analyses are the products from CRU in the UK and NASA GISS in New York. Both CRU and GISS produce gridded products, using different methodologies, starting from raw data from NWSs around the world. There are about three people involved in doing the GISTEMP analysis and they spend a couple of days a month on it. The idea that they are in any position to personally monitor the health of the observing network is laughable.

The misconception persists that climate models are somehow built on the surface temperature records, and that any adjustment to those records will change the model projections for the future. This probably stems from a misunderstanding of the notion of a physical model as opposed to a statistical model. A statistical model of temperature might, for instance, calculate a match between known forcings and the station data and then attempt to make a forecast based on the change in projected forcings. In such a case, the projection would be affected by any adjustment to the training data. However, the climate models used in the IPCC forecasts are not statistical, but are physical in nature. They are self-consistent descriptions of the whole system whose inputs are only the boundary conditions and the changes in external forces (such as the solar constant, the orbit, or greenhouse gases). They do not assimilate the surface data, nor are they initialised from it. Instead, the model results for, say, the mean climate, or the change in recent decades, or the seasonal cycle, or the response to El Niño events, are compared to the equivalent analyses in the gridded observations. Mismatches can help identify problems in the models, and are used to track improvements to the model physics. However, it is generally not possible to 'tune' the models to fit very specific bits of the surface data, and the evidence for that is the remaining (significant) offsets in average surface temperatures between the observations and the models. There is also no attempt to tweak the models in order to get better matches to regional trends in temperature.

Another mistaken assumption: that there is so little redundancy that throwing out a few dodgy met. stations will seriously affect the mean, and that evidence for global warming is exclusively tied to the land station data. It has been estimated that the mean anomaly in the Northern hemisphere at the monthly scale only has around 60 degrees of freedom - that is, 60 well-placed stations would be sufficient to give a reasonable estimate of the large scale month to month changes. Currently, although they are not necessarily ideally placed, there are thousands of stations - many times more than would be theoretically necessary.
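The redundancy point is easy to illustrate with a toy calculation. The sketch below is entirely synthetic and is not the GISTEMP or CRU method (there is no area weighting or gridding here): many hypothetical stations observe one shared large-scale signal plus independent local noise, and the mean anomaly is recomputed after discarding 10% of them at random. The station count, noise level, and drop fraction are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic setup: one shared large-scale monthly anomaly signal,
# observed by many stations with independent local/micro-site noise.
n_stations, n_months = 1200, 360
signal = np.cumsum(rng.normal(0.0, 0.05, n_months))      # shared climate signal
noise = rng.normal(0.0, 0.8, (n_stations, n_months))      # local noise per station
obs = signal + noise

full_mean = obs.mean(axis=0)

# Throw away 10% of the stations at random and recompute the mean.
keep = rng.choice(n_stations, size=int(0.9 * n_stations), replace=False)
subset_mean = obs[keep].mean(axis=0)

print("RMS difference between full and 90%-subset means:",
      np.sqrt(np.mean((full_mean - subset_mean) ** 2)))
print("RMS month-to-month spread of the signal itself:",
      np.std(signal))
```

With this many stations the difference between the full mean and the subset mean is tiny compared with the signal being tracked, which is the sense in which a handful of poorly sited stations cannot dominate the large-scale average.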
Since many of the participants in the latest effort appear to really want this assumption to be true, pointing out that it doesn't really follow might be a disincentive, but hopefully they won't let that detail damp their enthusiasm... As stated above, more information is always useful, but knowing what to do about potentially problematic sitings is tricky. One would really like to know when a problem first arose, for instance - something that isn't clear from a photograph taken today. If the station is moved now, there will be another potential artifact in the record. An argument could certainly be made that continuity of a series is more important for long term monitoring.

Debates over just how to compensate for the urban effect began seriously as early as 1967. After much debate the issue was pretty much settled, in terms of figuring out how to compensate for the urban effect and detecting a warming trend anyway, by 1990. Personally I got convinced that warming was underway in the late 1990s after borehole measurements in rocks around the world, far away from civilization, showed unmistakable evidence of warming over the past century... if you log temperature down the hole, you find that extra heat has been seeping down from the surface. I think any scientists not convinced by that would have been satisfied by the measurements of the oceans in the early 2000s that showed definitively that heat is seeping down there too. After all, most of the excess energy from any radiation imbalance will wind up in the oceans, and the top layers are ...
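Stepping back to a point made earlier in the article - that the analyses work with anomalies rather than absolute temperatures, and that monthly anomalies correlate over large distances - here is a minimal sketch of what "anomaly" means in practice. It is my own illustration, not CRU or GISS code; the base period, station names, and data layout are assumptions.

```python
import pandas as pd

def monthly_anomalies(series, base_start='1961', base_end='1990'):
    """Turn absolute monthly-mean temperatures (a pandas Series indexed by a
    sorted DatetimeIndex) into anomalies relative to each calendar month's
    average over the base period."""
    base = series[base_start:base_end]
    climatology = base.groupby(base.index.month).mean()   # one value per calendar month
    clim_for_each_month = series.index.month.map(climatology).to_numpy()
    return series - clim_for_each_month

# Hypothetical use: two stations' absolute temperatures differ by many
# degrees, but their month-to-month anomalies can still correlate strongly.
# anom_tn = monthly_anomalies(tennessee_monthly_tmean)
# anom_nj = monthly_anomalies(new_jersey_monthly_tmean)
# print(anom_tn.corr(anom_nj))
```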
tinyurl.com/ys6hg6 -> www.realclimate.org/index.php/archives/2007/07/no-man-is-an-urban-heat-island/ (same article as above)
realclimate.org

Climate Science -- group @ 12:45 pm

Anybody who has followed press reporting on global warming, and particularly on its effects on hurricanes, has surely encountered various contrarian pronouncements by William Gray, of Colorado State University. His latest paper (covered here by CNN) provides an illuminating window into Gray's thinking on the subject. Our discussion is not a point-by-point rebuttal of Gray's claims; there is far more wrong with the paper than we have the patience to detail. Gray will have plenty of opportunities to hear more about the work's shortcomings if it is ever subjected to the rigors of peer review. Here we will only highlight a few key points which illustrate the fundamental misconceptions on the physics of climate that underlie most of Gray's pronouncements on climate change and its causes. Gray's paper begins with a quote from Senator Inhofe calling global warming a hoax perpetrated on the American people, and ends with a quote by a representative of the Society of Petroleum Geologists stating that Crichton's State of Fear has "the absolute ring of truth." It is the gaping flaws in the scientific argument sandwiched between these two statements that are our major concern.

A recent documentary on the possible over-selling of climate change focussed on the link between high profile papers appearing in Nature or Science, the press releases, and the subsequent press coverage. The press coverage of the paper mostly picked up on the very high end sensitivities (up to 11°C) and often confused the notion of an equilibrium sensitivity with an actual prediction for 2100, and this led to some pretty way-out headlines. I think all involved would agree that this was not a big step forward in the public understanding of science. Is it because the scientists were being 'alarmist', or was it more related to a certain naivety in how public relations and the media work? And more importantly, what can scientists do to help ensure that media coverage is a fair reflection of their work?

Nicholas Kristof's column "The Big Burp Theory of the Apocalypse" (TimesSelect subscription required) appeared in the New York Times of 18 April. This column is built around the possibility of a catastrophic methane release from marine clathrate decomposition, but at heart it is really a lament that the more conventional and better understood harms of global warming have not proved sufficient to get the attention of the White House or Congress. This column is a refreshing change from the recent spate of backlash columns by Will, Novak and Lindzen attempting to tar climate scientists with the "alarmist" epithet. The column draws on Dave Archer's RealClimate article on clathrates, and it shows in Kristof's sound discussion of the basic science. He is very clear on why a clathrate catastrophe would be a bad thing, but equally clear about the uncertainties. The column even contains an intelligent discussion of the Paleocene-Eocene Thermal Maximum as a possible example of a clathrate catastrophe, taking care to point out that this event might not, in fact, have been caused by methane release. Quite a lot to get in a short column, while still managing to achieve a lively style that surely keeps the readers awake. Perhaps closest to our hearts is Kristof's cogently stated theme that uncertainty is in the nature of the science, and is no excuse for inaction -- indeed it should be a spur to greater action. "The White House has used scientific uncertainty as an excuse for its paralysis.
But our leaders are supposed to devise policies to protect us even from threats that are difficult to assess precisely -- and climate change should be considered even more menacing than a nuclear-armed Iran." He concludes, "The best reason for action on global warming remains the basic imperative to safeguard our planet in the face of uncertainty, and our leaders are failing wretchedly in that responsibility." Kristof is a 2006 winner of the Pulitzer Prize for commentary.

Aerosols reflect some incoming solar energy back out to space, and this cooling effect is believed to have counteracted part of the greenhouse gas warming. The original version of the film focused mainly on the observational recognition of global dimming, but one aspect did not receive much attention in the film - namely the oft-claimed lack of global dimming in climate models. This led some to assume that climate modelers were ignoring air pollution other than greenhouse gas emissions from fossil fuel burning. Another implication was that climate models are not capable of adequately simulating the transfer of sunlight through the atmosphere and the role of clouds, the extinction of sunlight by aerosols, aerosol effects on clouds, etc., and therefore that model projections should not be trusted. The NOVA version will address this issue more prominently by adding an interview with Jim Hansen from the NASA Goddard Institute for Space Studies. Along this line, I'd like to elaborate on aerosols in climate models in more detail.

It deserves to be more widely seen, so here it is again. I would say that the central flaw in the op-ed is a logical one: if you're trying to stifle dissent, then you want less funding for climate research, not more. If you're trying to stop global warming, then you want more money for carbon sequestration research, and you don't care how much is spent on climate research. On the other hand, if you just love climate research as a really interesting intellectual pursuit, that's when you've got an interest in shedding doubt on the reigning view that CO2-induced climate change is a serious policy problem requiring action. Twenty-five years ago, when global warming wasn't a big public worry, one might expect climate change researchers to hype the problem. In 2006, when public opinion mostly accepts that there's a problem, scientists who want research money should be emphasizing uncertainty. We have previously pointed out that there is no evidence whatsoever that 'alarmism' improves anyone's chances of getting funded - if anything it is continued uncertainty that propels funding decisions - and secondly, that the idea that there is a conspiracy against contrarian scientists is laughable. There is indeed a conspiracy against poor science, but there is no need to apologise for that! But rather than repeat ourselves once again, we thought we'd just sit back this time and allow our readers to comment...

Greenhouse gases -- rasmus @ 1:22 pm, by Rasmus Benestad and Ray Pierrehumbert

Venus Express will make unprecedented studies of the largely unknown phenomena taking place in the Venusian atmosphere. It is a European Space Agency (ESA) mission to probe the atmosphere of Venus and address questions regarding the differences between the climates of Venus and Earth. According to the plans, the probe will enter its final orbit around Venus in May 2006, i.e. within about a month. Primarily, Venus offers scientists the chance to see how the same basic physics used to study Earth's climate operates under a very different set of circumstances.
In some ways Venus is rather similar to Earth: it has nearly the same mass as Earth, and while its orbit is somewhat closer to the Sun, that effect is more than made up for by the sunlight reflected from Venus' thick cloud cover. Because of the cloud cover, the surface temperature of Venus would be a chilly -42C if it were not for the greenhouse effect of its atmosphere.

Another post discusses the "false objectivity of balance", i.e. the tendency for many journalists to treat scientific issues -- for which differing positions often do not have equal merit -- in the same "he said, she said" manner they might treat a story on policy or politics. This approach can appear balanced, but it leaves it to the reader to figure out on their own which position is most likely correct. However, the reader is rarely as well equipped as the writer to determine the bottom line, and in practice this plays into the hands of those who might seek to confuse the public through clever disinformation campaigns. Thankfully, some journalists "get it", and take the time (and effort) to assess where the balance of evidence really lies and report it accordingly.

Climate Science -- gavin @ 11:45 am

One of the nice things about being a scientist is that you c...
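The -42C figure quoted above for a greenhouse-free Venus follows from the standard planetary energy-balance relation T_eff = [S(1 - A)/(4*sigma)]^(1/4). The quick check below uses rounded textbook values for the solar constant at Venus' orbit and its Bond albedo; those two numbers are my assumptions, not taken from the article.

```python
# Energy-balance estimate of Venus' effective (no-greenhouse) temperature.
# T_eff = [ S * (1 - A) / (4 * sigma) ] ** 0.25
SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
S_VENUS = 2601.0        # solar constant at Venus' orbit, W m^-2 (approx.)
ALBEDO_VENUS = 0.75     # Bond albedo; rounded textbook value

t_eff = (S_VENUS * (1.0 - ALBEDO_VENUS) / (4.0 * SIGMA)) ** 0.25
print(f"Venus effective temperature: {t_eff:.0f} K ({t_eff - 273.15:.0f} C)")
# -> roughly 231 K, i.e. about -42 C, versus an observed surface temperature
#    of roughly 460 C under the actual CO2 greenhouse.
```

Because Venus reflects about three quarters of the incoming sunlight, it absorbs less energy per unit area than Earth does despite being closer to the Sun, which is why its no-greenhouse temperature comes out below freezing.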