US reveals Internet security efforts
Thursday, March 4, 2010
The United States announced the declassification of a portion of the Comprehensive National Cybersecurity Initiative, a major part of the US’s efforts to thwart cyber warfare, on Tuesday.
The announcement came at the RSA Security Conference in San Francisco, and was given by Howard Schmidt, the current US cyber-security coordinator, who was appointed to the position in December.
While only a portion of the document was revealed at the announcement, and much material remains classified, including everything related to the government's plans for offensive cyber-warfare, the declassified portions show that the program has twelve parts and three main strategies.
The program includes funding for numerous security measures, including the government’s controversial Einstein program, which scans all incoming communications to government-operated websites. The plan also mentions increasing security for classified networks within the government, as well as developing a government-wide plan for counter-intelligence work, although the declassified portions gave little indication as to what that would involve.
The program was begun by President George W. Bush in 2008 as a National Security Presidential Directive, and had been entirely classified until now. At its inception, it was intended to unify cyber-security efforts within the government and to develop other security programs for use nationwide. No budget has been released for the program, although estimates place the cost at $40 billion through 2015.
Renault F1 team exclusion overturned
Monday, August 17, 2009
According to an FIA press release, the one-race ban imposed on the Renault Formula One team following an incident at the Hungarian Grand Prix has been overturned.
The team was reprimanded after allowing Fernando Alonso to leave the pit lane with a wheel insecurely attached to the car; the wheel then came off out on the circuit and bounced dangerously down the back straight, fortunately without coming into contact with any other vehicles.
After hearing evidence from all parties, including Renault's Engineering Director Pat Symonds, the FIA's International Court of Appeal has repealed the one-race suspension, and has instead imposed a $50,000 fine and "issued a reprimand".
The decision allows Fernando Alonso to race in his second of two home Grands Prix this year, and will likely open the door for Renault’s test driver Romain Grosjean to take up the second seat, following the dismissal of seasoned underperformer Nelson Piquet Jr.
Keep your eyes peeled for cosmic debris: Andrew Westphal about Stardust@home
Sunday, May 28, 2006
Stardust is a NASA space capsule that collected samples from comet 81P/Wild (also known as "Wild 2") in deep space and landed back on Earth on January 15, 2006. It was decided that a collaborative online review process would be used to "discover" the microscopically small samples the capsule collected. The project is called Stardust@home. Unlike distributed computing projects like SETI@home, Stardust@home relies entirely on human intelligence.
Andrew Westphal is the director of Stardust@home. Wikinews interviewed him for May’s Interview of the Month (IOTM) on May 18, 2006. As always, the interview was conducted on IRC, with multiple people asking questions.
Some may not know exactly what Stardust or Stardust@home is. Can you explain more about it for us?
Stardust is a NASA Discovery mission that was launched in 1999. It is really two missions in one. The primary science goal of the mission was to collect a sample from a known primitive solar-system body, a comet called Wild 2 (pronounced "Vilt-two" — the discoverer was German, I believe). This is the first US "sample return" mission since Apollo, and the first ever from beyond the moon. This gives a little context. By "sample return" of course I mean a mission that brings back extraterrestrial material. I should have said above that this is the first "solid" sample return mission — Genesis brought back a sample from the Sun almost two years ago, but Stardust is also bringing back the first solid samples from the local interstellar medium — basically this is a sample of the Galaxy. This is absolutely unprecedented, and we're obviously incredibly excited. I should mention parenthetically that there is a fantastic launch video — taken from the POV of the rocket on the JPL Stardust website — highly recommended — best I've ever seen — all the way from the launch pad, too. Basically interplanetary trajectory. Absolutely great.
Is the video available to the public?
Yes [see below]. OK, I digress. The first challenge that we have before we can do any kind of analysis of these interstellar dust particles is simply to find them. This is a big challenge because they are very small (on the order of a micron in size) and are somewhere (we don't know where) on a HUGE collector — at least on the scale of the particle size — about a tenth of a square meter. So…
We’re right now using an automated microscope that we developed several years ago for nuclear astrophysics work to scan the collector in the Cosmic Dust Lab in Building 31 at Johnson Space Center. This is the ARES group that handles returned samples (Moon Rocks, Genesis chips, Meteorites, and Interplanetary Dust Particles collected by U2 in the stratosphere). The microscope collects stacks of digital images of the aerogel collectors in the array. These images are sent to us — we compress them and convert them into a format appropriate for Stardust@home.
Stardust@home is a highly distributed project using a "Virtual Microscope" that is written in HTML and JavaScript and runs in most browsers — no downloads are required. Using the Virtual Microscope, volunteers can search over the collector for the tracks of the interstellar dust particles.
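The mechanics behind such a browser-based viewer can be sketched in a few lines of JavaScript. This is only an illustration of the focus-stack idea, not Stardust@home's actual code; the file names, stack depth, and page setup below are invented:

```javascript
// Minimal sketch of a focus-stack viewer in the spirit of the Virtual
// Microscope. Everything here (file names, stack depth) is hypothetical.
const NUM_PLANES = 40; // assumed number of focal planes in one image stack

const viewer = document.createElement("img");
const slider = document.createElement("input");
slider.type = "range";
slider.min = 0;
slider.max = NUM_PLANES - 1;
slider.value = 0;

// Pre-load every focal plane of the stack so changing focus feels instant.
const planes = [];
for (let i = 0; i < NUM_PLANES; i++) {
  const img = new Image();
  img.src = `tile_0001_z${i}.jpg`; // hypothetical naming scheme for one tile
  planes.push(img);
}

// Moving the slider stands in for turning the microscope's focus knob:
// swap in the image captured at the corresponding focal depth.
slider.addEventListener("input", () => {
  viewer.src = planes[Number(slider.value)].src;
});

viewer.src = planes[0].src;
document.body.append(viewer, slider);
```

A volunteer would focus up and down through each view this way while looking for tracks; the real application presumably adds calibration and track-flagging on top of a basic loop like this.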
How many samples do you anticipate being found during the course of the project?
Great question. The short answer is that we don't know. The long answer is a bit more complicated. Here's what we know. The Galileo and Ulysses spacecraft carried dust detectors onboard that Eberhard Gruen and his colleagues used to first detect and then measure the flux of interstellar dust particles streaming into the solar system. (This is a kind of "wind" of interstellar dust, caused by the fact that our solar system is moving with respect to the local interstellar medium.) Markus Landgraf has estimated the number of interstellar dust particles that should have been captured by Stardust during two periods of the "cruise" phase of the interplanetary orbit in which the spacecraft was moving with this wind. He estimated that there should be around 45 particles, but this number is very uncertain — I wouldn't be surprised if it is quite different from that. That was the long answer! One thing that I should say… is that like all research, the outcome of what we are doing is highly uncertain. There is a wonderful quote attributed to Einstein — "If we knew what we were doing, it wouldn't be called 'research', would it?"
How big would the samples be?
We expect that the particles will be of order a micron in size. (A millionth of a meter.) When people are searching using the virtual microscope, they will be looking not for the particles, but for the tracks that the particles make, which are much larger — several microns in diameter. Just yesterday we switched over to a new site which has a demo of the VM (virtual microscope) I invite you to check it out. The tracks in the demo are from submicron carbonyl iron particles that were shot into aerogel using a particle accelerator modified to accelerate dust particles to very high speeds, to simulate the interstellar dust impacts that we’re looking for.
And that’s on the main Stardust@home website [see below]?
Yes.
How long will the project take to complete?
Partly the answer depends on what you mean by "the project". The search will take several months. The bottleneck, we expect (but don't really know yet), is in the scanning — we can only scan about one tile per day, and there are 130 tiles in the collector…. These particles will be quite diverse, so we're hoping that we'll continue to have lots of volunteers collaborating with us on this after the initial discoveries. It may be that the 50th particle that we find will be the real Rosetta stone that turns out to be critical to our understanding of interstellar dust. So we really want to find them all! Enlarging the idea of the project a little beyond the search, though, the goal is to actually analyze these particles. That's the whole point, obviously!
And this is the huge advantage with this kind of a mission — a “sample return” mission.
Most missions do things quite differently… you have to build an instrument to make a measurement, and that instrument design gets locked in several years before launch, practically guaranteeing that it will be obsolete by the time you launch. Here exactly the opposite is true. Several of the instruments that are now being used to analyze the cometary dust did not exist when the mission was launched. Further, some instruments (e.g., synchrotrons) are the size of shopping malls — you don't have a hope of flying these in space. So we can and will study these samples for many years. AND we have to preserve some of these dust particles for our grandchildren to analyze with their hyper-quark-gluon plasma microscopes (or whatever)!
When do you anticipate the project starting?
We’re really frustrated with the delays that we’ve been having. Some of it has to do with learning how to deal with the aerogel collectors, which are rougher and more fractured than we expected. The good news is that they are pretty clean — there is very little of the dust that you see on our training images — these were deliberately left out in the lab to collect dust so that we could give people experience with the worst case we could think of. In learning how to do the scanning of the actual flight aerogel, we uncovered a couple of bugs in our scanning software — which forced us to go back and rescan. Part of the other reason for the delay was that we had to learn how to handle the collector — it would cost $200M to replace it if something happened to it, so we had to develop procedures to deal with it, and add several new safety features to the Cosmic Dust Lab. This all took time. Finally, we’re distracted because we also have many responsibilities for the cometary analysis, which has a deadline of August 15 for finishing analysis. The IS project has no such deadline, so at times we had to delay the IS (interstellar, sorry) in order to focus on the cometary work. We are very grateful to everyone for their patience on this — I mean that very sincerely.
And rest assured that we’re just as frustrated!
I know there will be a “test” that participants will have to take before they can examine the “real thing”. What will that test consist of?
The test will look very similar to the training images that you can look at now. But… there will of course be no annotation to tell you where the tracks are!
Why did NASA decide to take the route of distributed computing? Will they do this again?
I wouldn’t say that NASA decided to do this — the idea for Stardust@home originated here at U. C. Berkeley. Part of the idea of course came…
If I understand correctly, it isn't distributed computing, but distributed eyeballing?
…from the SETI@home people who are just down the hall from us. But as Brian just pointed out, this is not really distributed computing like SETI@home; the computers are just platforms for the VM, and it is human eyes and brains that are doing the real work, which makes it fun (IMHO).
That said… There have been quite a few people who have expressed interest in developing automated algorithms for searching. Just because WE don't know how to write such an algorithm doesn't mean nobody does. We're delighted at this and are happy to help make it happen.
Isn't there a catch-22 in that the data you're going to collect would be a prerequisite to automating the process?
That was the conclusion that we came to early on — that we would need some sort of training set to be able to train an algorithm. Of course you have to train people too, but we’re hoping (we’ll see!) that people are more flexible in recognizing things that they’ve never seen before and pointing them out. Our experience is that people who have never seen a track in aerogel can learn to recognize them very quickly, even against a big background of cracks, dust and other sources of confusion… Coming back to the original question — although NASA didn’t originate the idea, they are very generously supporting this project. It wouldn’t have happened without NASA’s financial support (and of course access to the Stardust collector). Did that answer the question?
Will a project like this be done again?
I don't know… There are only a few projects for which this approach makes sense… In fact, I frankly haven't run across another, at least in Space Science. But I am totally open to the idea of it. I am not in favor of just doing it as "make-work" — that is, artificially taking this approach when another approach would make more sense.
How did the idea come up to do this kind of project?
Really desperation. When we first thought about this we assumed that we would use some sort of automated image recognition technique. We asked some experts around here in CS and the conclusion was that the problem was somewhere between trivial and impossible, and we wouldn't know until we had some real examples to work with. So we talked with Dan Werthimer and Dave Anderson (literally down the hall from us) about the idea of a distributed project, and they were quite encouraging. Dave proposed the VM machinery, and Josh Von Korff, a physics grad student, implemented it. (Beautifully, I think. I take no credit!)
I got to meet one of the Stardust directors in March during the Texas Aerospace Scholars program at JSC. She talked about searching for meteorites in Antarctica, ones that were unblemished by Earth conditions. Is that our best chance of finding new information on comets and asteroids? Or will more Stardust programs be our best solution?
That’s a really good question. Much will depend on what we learn during this official “Preliminary Examination” period for the cometary analysis. Aerogel capture is pretty darn good, but it’s not perfect and things are altered during capture in ways that we’re still understanding. I think that much also depends on what question you’re asking. For example, some of the most important science is done by measuring the relative abundances of isotopes in samples, and these are not affected (at least not much) by capture into aerogel.
Also, she talked about how some of the agencies that they gave samples to had lost or destroyed 2-3 samples while trying to analyze them. One, in fact, had become statically charged and stuck to the side of the microscope lens, and they spent over an hour looking for it. Is that really our biggest danger? Giving out samples as a show of good faith, and not letting NASA examine all samples collected?
These will be the first measurements, probably, that we'll make on the interstellar dust. There is always a risk of loss. Fortunately for the cometary samples there is quite a lot there, so it's not a disaster. NASA has some analytical capabilities, particularly at JSC, but the vast majority of the analytical capability in the community is not at NASA but is at universities, government labs and other institutions all over the world. I should also point out that practically every analytical technique is destructive at some level. (There are a few exceptions, but not many.) The problem with meteorites is that except in a very few cases, we don't know where they specifically came from. So having a sample that we know for sure is from the comet is golden!
I am currently working on my Bachelor's in computer science, with a minor in astronomy. Do you see the successes of programs like Stardust opening up more private space exploration positions for people such as myself, even though I'm not in the typical "space" fields of education?
Can you elaborate on your question a little — I’m not sure that I understand…
Well, while at JSC I learned that they mostly want engineers, and a few science grads, and I worry that my computer science degree will not be very valuable, as the NASA rep told me only 1% of the applicants for their work study program are CS majors. I'm just curious as to your thoughts on whether CS majors will be more in demand now that projects like Stardust and the Mars missions have been great successes? Have you seen a trend towards more private businesses moving in that direction, especially with President Bush's statement of Man on the Moon in 2015?
That’s a good question. I am personally not very optimistic about the direction that NASA is going. Despite recent successes, including but not limited to Stardust, science at NASA is being decimated.
I made a joke with some people at the TAS event that one day SpaceShipOne will be sent up to save stranded ISS astronauts. It makes me wonder what kind of private redundancy the US government is taking for future missions.
I guess one thing to be a little cautious about is that despite SpaceShipOne's success, we haven't had an orbital project that has been successful in that style of private enterprise. It would be nice to see that happen. I know that there's a lot of interest…!
Now I know the answer to this question… but a lot do not… When samples are found, how will they be analyzed? Who gets the credit for finding the samples?
The first person who identifies an interstellar dust particle will be acknowledged on the website (and probably will be much in demand for interviews from the media!), will have the privilege of naming the particle, and will be a co-author on any papers that WE (at UCB) publish on the analysis of the particle. Also, although we are precluded from paying for travel expenses, we will invite those who discover particles AND the top performers to our lab for a hands-on tour.
We have some fun things, including micromachines.
How many people/participants do you expect to have?
About 113,000 have preregistered on our website. Frankly, I don’t have a clue how many will actually volunteer and do a substantial amount of searching. We’ve never done this before, after all!
One last thing I want to say … well, two. First, we are going to special efforts not to do any searching ourselves before we go “live”. It would not be fair to all the volunteers for us to get a jumpstart on the search. All we are doing is looking at a few random views to make sure that the focus and illumination are good. (And we haven’t seen anything — no surprise at all!) Also, the attitude for this should be “Have Fun”. If you’re not having fun doing it, stop and do something else! A good maxim for life in general!
Climate change impacts Wyoming
Tuesday, March 18, 2008
Cheek-numbing, eye-watering winds whip across the plains of the Laramie Basin, Wyoming. The ground is yellow-brown with patches of recalcitrant snow. Sheep Mountain is losing its winter coat. All normal affairs for March. The March edition of the Wyoming Basin Outlook Report also reports, based on February accumulations, that Snow Water Equivalent is at 99% of average.
The SWE is a measure of the snow pack that feeds the streams, rivers and reservoirs that Wyoming, Nebraska and other states depend upon for water. Current averages are compared to the average SWE for 1971-2000. In recent years, snow pack in this region has been anything but normal.
The Outlook Reports are issued January to June. Since March 2000, only five of 46 months have been above normal. While many of the winter months have been near normal, June’s snow pack is far below average. Even in 2006, the wettest year of the last eight years, June snow pack was only 37% of the average.
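The "percent of average" figures above are simple arithmetic against the 1971-2000 baseline; here is a minimal JavaScript sketch, using invented sample numbers rather than actual measurements:

```javascript
// Sketch of the percent-of-average calculation behind figures like
// "SWE is at 99% of average". All numbers are invented examples.
const baseline1971to2000 = [11.8, 12.4, 13.1, 12.0, 12.7]; // March SWE, inches
const currentSWE = 12.3; // this March's reading, inches

const average =
  baseline1971to2000.reduce((sum, x) => sum + x, 0) / baseline1971to2000.length;
const percentOfAverage = (currentSWE / average) * 100;

console.log(`SWE is at ${percentOfAverage.toFixed(0)}% of the 1971-2000 average`);
```

In practice the baseline would cover all thirty years from 1971 to 2000 rather than the five values used here for brevity.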
In an e-mail interview with Wikinews, Lee Hackleman, Water Supply Specialist, said:

"The snowpack is melting out several weeks earlier than average. The higher temperatures in the spring are responsible for this. There seems to be a significant drop in the amount of runoff that we are able to retain in our reservoirs; a lot of runoff seems to be soaking into the ground. We do not have the June flood events any more. We use to [sic] be cool then hot, not cool warm then hot."
In a phone interview with Wikinews, Myra Wilensky of the National Wildlife Federation in nearby Colorado also commented on changing snow patterns.

"In the west, nothing is ever clockwork; the patterns shift: a good amount of snowfall in the season and then a quick warm-up. We don't get the prolonged snowpack that we used to have. We may have a really wet snow year, then a really dry one with rain. You can't count on getting the estimated amount of snow anymore. March and November have historically been our snowiest months, but this year it's been fairly dry in March and November. Winter is shorter now."
This is part of a general increase in temperature in the region. An Intergovernmental Panel on Climate Change report cited by the National Wildlife Federation estimates that the temperature will rise almost 7 degrees Fahrenheit by 2100.
"This will likely cause most, if not all, of the state's glaciers to disappear. Wildfires may increase, droughts could get worse, and rains, when they do come, will likely come in more severe downpours that may cause more flash flooding. Warmer temperatures also mean less snowpack in the mountains, leading to more winter runoff and reduced summer flows in many Wyoming streams."
The NWF's main concern is the fate of the wildlife in the region, particularly the impact of pine bark beetles. Warmer winters have led to mass infestations in Western lodgepole pine forests, and The New York Times reports that the beetles are now moving on to whitebark pines in Yellowstone, particularly impacting grizzly bears there. In turn, the grizzlies are shifting to feeding on Canadian thistle, an invasive species that might be choking out native plants.
Changing weather patterns have also affected large migratory animals.
"This year winter came late. When the heavy snows hit, the mule deer and the elk were spread out and had to be fed. Feeding isn't newsworthy; it happened before, like in 1982, but it wasn't as successful this year because they were so spread out."
Water for people has also become a major issue in the region.
"There is a much greater concern for water rights than there used to be. There is not enough late-season water to satisfy everyone all the time."
Kansas has long fought Wyoming over water rights issues. And Montana is currently suing Wyoming, claiming that the Yellowstone River Compact signed in 1950 gives rights to both surface and ground water, while Wyoming disagrees. On February 18, the Supreme Court agreed to hear the lawsuit.
"Wyoming officials say they are adhering to the compact and that the drought has meant less water for both states. But Montana says Wyoming is storing more water in reservoirs than the compact permits and allowing excessive pumping of groundwater reserves that feed into the two rivers. Those 'groundwater' reserves are tapped by some Wyoming farmers to irrigate their fields. Energy companies discharge large volumes of groundwater during production of coal-bed methane, a type of natural gas prevalent in northern Wyoming."
Authorities do not see this fight over increasingly limited water resources going away anytime soon.
"Everyone is going to have to learn to get by with less."