Calling all Zooniverse volunteers! As we transition from the dog days of summer to the pumpkin spice latte days of fall (well, in the Northern hemisphere at least) it’s time to mobilize and do science!
Our Zooniverse community of over 1.3 million volunteers has the ability to focus efforts and get stuff done. Join us for the Sunspotter Citizen Science Challenge! From August 29th to September 5th, it’s a mad sprint to complete 250,000 classifications on Sunspotter.
Sunspotter needs your help so that we can better understand and predict how the Sun’s magnetic activity affects us on Earth. The Sunspotter science team has three primary goals:
Hone a more accurate measure of sunspot group complexity
Improve how well we are able to forecast solar activity
Create a machine-learning algorithm based on your classifications to automate the ranking of sunspot group complexity
In order to achieve these goals, volunteers like you compare two sunspot group images taken by the Solar and Heliospheric Observatory and choose the one you think is more complex. Sunspotter is what we refer to as a “popcorn project”: you can jump right into the project, and each classification is quick, taking about 1 to 3 seconds.
Let’s all roll up our sleeves and advance our knowledge of heliophysics!
I and other Galaxy Zoo and Zooniverse scientists are looking forward to the Citizen Science Association (CSA) and American Association for the Advancement of Science (AAAS) meetings in San Jose, California this week.
As I mentioned in an earlier post, we’ve organized an AAAS session titled “Citizen Science from the Zooniverse: Cutting-Edge Research with 1 Million Scientists,” which will take place on Friday afternoon. It fits well with the AAAS’s theme this year: “Innovations, Information, and Imaging.” Our excellent line-up includes Laura Whyte (Adler) on Zooniverse, Brooke Simmons (Oxford) on Galaxy Zoo, Alexandra Swanson (U. of Minnesota) on Snapshot Serengeti, Kevin Wood (U. of Washington) on Old Weather, Paul Pharoah (Cambridge) on Cell Slider, and Phil Marshall (Stanford) on Space Warps.
And in other recent Zooniverse news, which you may have heard already, citizen scientists from the Milky Way Project examined infrared images from NASA’s Spitzer Space Telescope and found lots of “yellow balls” in our galaxy. These turn out to be signposts of the early stages of massive star formation, in which newly formed stars heat up the surrounding dust grains. Charles Kerton and Grace Wolf-Chase have published the results in the Astrophysical Journal.
But let’s get back to the AAAS meeting. It looks like many other talks, sessions, and papers presented there involve citizen science too. David Baker (FoldIt) will give a plenary lecture on post-evolutionary biology and protein structures on Saturday afternoon. Jennifer Shirk (Cornell), Meg Domroese and others from CSA have a session Sunday morning, in which they will describe ways to utilize citizen science for public engagement. (See also this related session on science communication.) Then in a session Sunday afternoon, people from the European Commission and other institutions will speak about global earth observation systems and citizen scientists tackling urban environmental hazards.
Before all of that, we’re excited to attend the CSA’s pre-conference on Wednesday and Thursday. (See their online program.) Chris Filardi (Director of Pacific Programs, Center for Biodiversity and Conservation, American Museum of Natural History) and Amy Robinson (Executive Director of EyeWire, a game to map the neural circuits of the brain) will give the keynote addresses there. For the rest of the meeting, as with the AAAS, there will be parallel sessions.
The first day of the CSA meeting will include: many sessions on education and learning at multiple levels; sessions on diversity, inclusion, and broadening engagement; a session on defining and measuring engagement, participation, and motivations; a session on CO2 and air quality monitoring; a session on CS in biomedical research;
and sessions on best practices for designing and implementing CS projects, including a talk by Chris Lintott on the Zooniverse and one by Nicole Gugliucci on CosmoQuest. The second day will bring many more talks and presentations along these and related themes, including one by Julie Feldt about educational interventions in Zooniverse projects and one by Laura Whyte about Chicago Wildlife Watch.
I also just heard that the Commons Lab at the Woodrow Wilson Center is releasing two new reports today, and hardcopies will be available at the CSA meeting. One report is by Muki Haklay (UCL) about “Citizen Science and Policy: A European Perspective” and the other is by Teresa Scassa & Haewon Chung (U. of Ottawa) about “Typology of Citizen Science Projects from an Intellectual Property Perspective.” Look here for more information.
In any case, we’re looking forward to these meetings, and we’ll keep you updated!
Yesterday we launched our latest project: Penguin Watch. It is already proving to be one of the most popular projects we run, with over one hundred thousand classifications in the first day! The data come from 50 cameras focussed on the nesting areas of penguin colonies around the Southern Ocean. Volunteers are asked to tag adult penguins, chicks, and eggs.
Here are my favourite images uncovered by our volunteers so far: (click on an image to see what people are saying about it on Penguin Watch Talk)
The Constructing Scientific Communities project (ConSciCom), part of the AHRC’s ‘Science in Culture’ theme, is inviting proposals for citizen science or citizen humanities projects to be developed as part of the Zooniverse platform.
ConSciCom examines citizen science in the 19th and 21st centuries, contrasting and reflecting on engagement with distributed communities of amateur researchers in both the historical record and in contemporary practice.
Between one and four successful projects will be selected from responses to this call, and will be developed and hosted by the Zooniverse in association with the applicants. We hope to include both scientific and historical projects; those writing proposals should review the existing range of Zooniverse projects, which include not only classification but also transcription projects. Please note, however, that ConSciCom cannot distribute funds nor support imaging or other digitization in support of the project.
Projects will be selected according to the following criteria:
Merit and usefulness of the data expected to result from the project.
Novelty of the problem; projects which require extending the capability of the Zooniverse platform or serve as case studies for crowdsourcing in new areas or in new ways are welcome.
Alignment with the goals and interests of the Constructing Scientific Communities project. In particular, we wish to encourage projects that:
Have a significant historical dimension, especially in relation to the history of science.
Involve the transcription of text, either in its entirety or for rich metadata.
Note that it is anticipated that some, but not necessarily all, selected projects will meet this third criterion; please do submit proposals on other topics.
A few months ago we quietly placed a new project online. Called Sunspotter, it was essentially a game of hot-or-not for sunspot data – and since there were not many images available at the time, we thought it best to just let it be used by the people who noticed it, or who had tried it during the beta test. The results have since been validated, and the site works! In fact there are even preliminary results, which is all very exciting. Loads of new images have now been prepared, so today Sunspotter gets its proper debut. Try it at www.sunspotter.org.
On the site you are shown two images of sunspot groups and asked which is more complex. That might sound odd at first, but really it’s quite easy. The idea behind the science of Sunspotter is summed up neatly on the Sunspotter blog:
I’m pretty sure you have an idea of which is the more complex: a graduate text on quantum mechanics, or an Italian cookbook? On the other hand, it would not be straightforward for a computer to make that choice. The same is true with sunspot groups.
Or put another way: like many things in life, you’ll know complexity when you see it. Try it out now: it works on laptops, desktops, tablets and phones and you can keep up to date on Twitter, Facebook, G+, and the project’s own blog.
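If you are curious how thousands of one-on-one “which is more complex?” votes can become a single ranked list, a standard approach is a pairwise-comparison rating model. Below is a minimal Elo-style sketch: it illustrates the general technique only, and is not Sunspotter’s actual algorithm (the function name and sample votes are made up for illustration).

```python
# Illustrative Elo-style ranking from pairwise "which is more complex?" votes.
# This is a sketch of the general technique, not Sunspotter's real pipeline.

def elo_rank(votes, k=32.0, base=1000.0):
    """votes: iterable of (winner_id, loser_id) pairs, where the 'winner'
    is the image judged more complex. Returns {image_id: score}."""
    scores = {}
    for winner, loser in votes:
        rw = scores.setdefault(winner, base)
        rl = scores.setdefault(loser, base)
        # Expected probability that 'winner' beats 'loser' given current scores
        expected = 1.0 / (1.0 + 10 ** ((rl - rw) / 400.0))
        # Winner gains and loser loses points, more so when the result is surprising
        scores[winner] = rw + k * (1.0 - expected)
        scores[loser] = rl - k * (1.0 - expected)
    return scores

votes = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B")]
scores = elo_rank(votes)
# sorting by score, descending, gives the complexity ranking
ranking = sorted(scores, key=scores.get, reverse=True)
```

Each image only ever needs to be compared against a few others; the ratings stitch the local judgements into a global ordering, which is exactly why quick pairwise questions work so well for this kind of task.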
The aim is to bring together, for a week, computer scientists, machine learning experts, and scientists from astronomy- and planetary-science-based citizen science projects, with the goal of taking the first steps towards addressing the critical questions and issues that citizen science will need to solve in order to cope with the petabyte data deluge from the next generation of observatories and space missions, such as the Square Kilometre Array (SKA) and the Large Synoptic Survey Telescope (LSST). I think it’s fair to say this is the largest gathering of Zooniverse astronomy and planetary science project teams assembled in one place. I’m looking forward to seeing what new algorithms for better combining and assessing your classifications are developed during the week, and to all the interesting results that will come out of this workshop.
In addition to the main workshop, there will be a teacher workshop held on March 2nd for local teachers in Taiwan, co-organized by Lauren Huang (ASIAA), Mei-Yin Chou (ASIAA), Stuart Lynn (Adler Planetarium/Zooniverse), Kelly Borden (Adler Planetarium/Zooniverse), and myself. In preparation for the workshop, the ASIAA Education and Public Outreach (EPO) Office translated Planet Four into traditional-character Chinese. You can find out more about the translation effort here. At the teacher workshop, we’ll be introducing citizen science and how it can be used in the classroom, along with presenting the traditional-character Chinese translations of Planet Four and Galaxy Zoo.
The first day of the main workshop will be a series of introductory talks aimed at getting everyone thinking for the working sessions later in the week. If you’re interested in watching the workshop talks, we’re going to try webcasting the first day’s sessions on Google+ starting on March 3rd at 9:30am in Taiwan (March 2nd at 8:30pm EST). The schedule for Monday can be found here. You can find the links for the morning and afternoon video live streams here. If you can’t watch live, the video will be archived and available on YouTube through the same link.
You can follow along for the rest of the week on Twitter with the hashtag #csatro.
The Zooniverse is extremely pleased to announce that it has been named as one of six Google Global Impact Awardees announced in December 2013. The awards are given by Google to projects that show three key elements:
technology or innovative approach that can deliver transformational impact
a specific project that tests a big game-changing idea
a brilliant team with a healthy disregard for the impossible
The grant we have received from Google as part of their Global Impact Award program will allow us to build a platform that can support hundreds or maybe even thousands of new and exciting citizen science projects. A list of the awardees can be seen on the Google Global Impact Awards site: http://www.google.com/giving/global-impact-awards/
It means a lot to us at the Zooniverse to have been given this award and we could not have managed it without you, our volunteers. The time and effort you dedicate to our projects shows the world how important citizen science can be, and we’re looking forward to the next few years.
So thanks to you, and thanks to Google!
The Zooniverse team
PS: Just to be clear, this is a philanthropic act from Google – we’ll continue to be an academic project run by the team at Oxford, the Adler Planetarium, and elsewhere, and all your data remains with the Zooniverse as before. Nothing changes, except our ability to scale!
This time last year we launched the Andromeda Project. The aim was to get everyone’s help in locating the star clusters in the Andromeda Galaxy, our next-door neighbour in intergalactic space. The project went better than we could have imagined, and just over two weeks later we had completed more than 1,000,000 classifications and the project’s science team were busy wrangling data.
In fact, in January Cliff Johnson took a poster to one of the world’s biggest astronomy meetings – the January meeting of the AAS – and presented the results from the Andromeda Project, which had only launched 6 weeks prior. It was an amazing example of the power of citizen science to help researchers accomplish the kind of data analysis that computers cannot do reliably.
We decided to do a second round of the Andromeda Project to complete the job we’d started, using both the data that remained in the archive and also new data that was only just being taken last year when the project launched. So in October 2013 (just two months ago) we once again invited the Zooniverse community to come and find star clusters and galaxies. They once again astounded us by gobbling up the data even faster – ably assisted by a tranche of new users brought to the project from Facebook’s popular I F***king Love Science page. In a week the job was done.
The science team have already begun processing the data from this second round and the results are amazing. In fact: they’re right here just for you, just because it’s nearly Christmas and just because we wanted to give you a present. So here they are: the first maps of all the star clusters and galaxies in the data from the PHAT survey of Andromeda. Marked and classified by the wonderful Andromeda Project community.
You can see how the background galaxies are best seen at the outer edges (because we are looking through less material), and the clusters are found predominantly in the spiral arms (where more star formation is happening). These plots will form part of the publications the science team is currently working on, which will most likely appear on the Zooniverse Publications page sometime in 2014. Follow along on the blog, Twitter and Facebook for updates from the science team in the coming weeks and months.
Congratulations to everyone who helped out and gave their time to the Andromeda Project: you were amazing!
So as much as I’d like to wish the Andromeda Project a happy birthday, it seems like I should really wish it a happy retirement. Luckily we have more space-based projects coming soon to the Zooniverse – so the community will have plenty to get along with. However, the Andromeda Project will always have a special place in our hearts for its efficient and dedicated volunteers. Who knows, maybe one day it will come out of retirement for one last hurrah? We can only hope.
One of the best things about being an educator on the Zooniverse development team is the opportunity to interact with teachers who are using Zooniverse projects in their classroom and teachers who are interested in using Zooniverse projects in the classroom. Teachers cite several reasons why they use these projects – Authentic data? Check. Contributing to cutting-edge research across a variety of scientific fields? Check. Free? Check. Classifying a few galaxies in Galaxy Zoo or identifying and measuring some plankton in Plankton Portal can be an exciting introduction to participating in scientific investigations with “the professionals.” This isn’t enough though; teachers and other educators are hungry for ways to facilitate deeper student engagement with scientific data. Zooniverse educators and developers are consistently asked “How can my students dig deeper into the data on Zooniverse?”
This is where ZooTools comes into play. The Zooniverse development team has recently created ZooTools as a place where volunteers can observe, collect, and analyze data from Zooniverse citizen science projects. These tools were initially conceived as a toolkit for adult volunteers to use to make discoveries within Zooniverse data but it is becoming apparent that these would also have useful applications in formal education settings. It’s worth pointing out that these tools are currently in beta. In the world of web development beta basically means “it ain’t perfect yet.” ZooTools is not polished and perfect; in fact it’s possible you may encounter some bugs.
Projects like Galaxy Zoo and Planet Hunters have an impressive history of “extra credit” discoveries made by volunteers. Galaxy Zoo volunteers have made major contributions to the astronomy literature through the discovery of the green peas galaxies and Hanny’s Voorwerp. In Planet Hunters, volunteers use Talk to share methods for exploring the project’s light curves and the results they find. ZooTools lowers the barrier to entry by equipping volunteers with some simple tools to look for interesting relationships and results contained within the data. No specialist knowledge required.
We’ve only begun thinking about how ZooTools could be used in the classroom. I started my own investigation with a question that came from a Zooniverse classroom visit last spring. While making observations as a class about some of the amazing animals in Snapshot Serengeti, one young man asked about civets. He wanted to know if they were nocturnal. We had an interesting discussion about how you could find out this information. The general consensus was to Google it or look it up on Wikipedia. I wondered if you could use the data contained within Snapshot Serengeti to come up with a reasonable answer. I was excited to roll up my sleeves and figure out how to use these tools to find a likely answer. Here are the steps I took…
Step 2: Select a project. Currently only a few projects have data available to explore using ZooTools.
Step 3: Create a dashboard.
Step 4: Name your dashboard something awesome. I called mine Civets! for obvious reasons.
Step 5: This is your blank dashboard.
Step 6: It’s time to select a data source. I selected Snapshot Serengeti.
Step 7: This is the data source.
Step 8: I wanted to be able to filter my data, so I selected Filter under search type. The name of this dataset is Snapshot Serengeti 1.
Step 9: Since I wanted to look at civets, I selected that on the species dropdown menu and then clicked Load Data. My dataset will only contain images that Snapshot Serengeti volunteers identified as civets.
Step 10: I had my data; next it was time to select a Tool. I selected Tools at the top of the page.
Step 11: I selected Subject Viewer because this tool allows me to flip through different images.
Step 12: Next I had to connect my data source to my tool. From the Data Source drop down menu I selected Snapshot Serengeti 1.
Step 13: In order to get a good look at the images in my dataset, I clicked the icon shaped like a fork to close the pane. I then used the arrows to advance through the images.
I flipped through the images and kept a tally of night versus day captures. Of the 37 images in my dataset, I observed that 34 were taken at night and 3 were taken during the day. This led me to the conclusion that civets are likely nocturnal. This was so much more satisfying than just going to Google or Wikipedia. A couple of other questions that I explored…
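A tally like this is also easy to automate once you can get capture timestamps for the filtered subjects. Here is a minimal sketch under stated assumptions: the timestamps are ISO-8601 strings, and the sample values and the night-time cutoff hours are made up for illustration (they are not ZooTools’ actual export format).

```python
# Estimate how nocturnal an animal is from camera-trap capture times.
# Timestamps and cutoff hours are illustrative assumptions, not a real export.
from datetime import datetime

def nocturnal_fraction(timestamps, night_start=19, night_end=6):
    """timestamps: ISO-8601 capture times. A capture counts as 'night'
    when its hour falls outside the daytime window [night_end, night_start)."""
    night = sum(
        1 for ts in timestamps
        if not (night_end <= datetime.fromisoformat(ts).hour < night_start)
    )
    return night / len(timestamps)

civet_times = ["2013-06-01T02:14:00", "2013-06-03T23:40:00",
               "2013-06-05T11:05:00", "2013-06-07T01:22:00"]
frac = nocturnal_fraction(civet_times)  # 3 of 4 sample captures at night -> 0.75
```

With my real tally of 34 night captures out of 37, the same calculation gives roughly 0.92, which is what led me to call civets likely nocturnal.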
What is the distribution of animals identified at one camera trap site?
How many honeybadgers have been observed by Snapshot Serengeti volunteers across different camera traps?
Of course this is just the tip of the iceberg. Currently you can use ZooTools to explore data from Galaxy Zoo, Space Warps, and Snapshot Serengeti. The specific tools and datasets available vary from project to project. In Galaxy Zoo, for example, you can look at data from Galaxy Zoo classifications or from SDSS Skyserver. Hopefully you’ll be inspired to have a play with these tools! What questions would you or your students like to explore?
In the coming months the Zooniverse Education Blog will feature guest posts from participants in the Zooniverse Teacher Ambassadors Workshop. Today’s guest blogger William H. Waller is author of The Milky Way — An Insider’s Guide and co-editor of The Galactic Inquirer— an e-journal and forum on the topics of galactic and extragalactic astronomy, cosmochemistry and astrobiology, and interstellar communications. Bill’s day job involves teaching courses in physics and astronomy at Rockport High School.
For most of human history, the night sky demanded our attention. The shape-shifting Moon, wandering planets, pointillist stars, and occasional comet enchanted our sensibilities while inspiring diverse tales of origin. The Milky Way, in particular, exerted a powerful presence on our distant ancestors. Rippling across the firmament, this irregular band of ghostly light evoked myriad myths of life and death among the stars. In 1609, Galileo Galilei pointed his telescope heavenward and discovered that the Milky Way is “nothing but a congeries of innumerable stars grouped together in clusters.” Fast forward 400 years to the present day, and we find that the Milky Way has all but disappeared from our collective consciousness. Where did it go?
For 25 years as an astronomy educator, I have informally polled hundreds of students, teachers, and the general public regarding their awareness of the night sky. Invariably, no more than 25 percent have ever seen the Milky Way with their own eyes. For city dwellers, this is completely understandable. Unless properly shielded, the artificial lighting from municipal, commercial, and residential sources will spill into the sky and overwhelm the diffuse band of luminescence that is the hallmark of our home galaxy. The recent video “The City Dark” produced by POV underscores the disruptive aspects that artificial lighting can produce on the life cycles of certain animals – and even upon ourselves.
For residents of small towns well away from large cities (such as my own hometown of Rockport, MA), it is much easier to find dark “sanctuaries” where the Milky Way can be spied in all its exquisite beauty. Yet when I poll Rockport’s sundry inhabitants about having ever seen the Milky Way, I still get a measly 25% positive response. What’s going on here?
Is it that they don’t care about astronomy and the night sky? I would have to say that such astronomical indifference is not typical. Most people in conversations with me will volunteer their fascination for the planets, stars, and the exotica that our universe provides in abundance – from exoplanets to pulsars, black holes, dark matter, and dark energy. Images from our great space telescopes have also revealed to the casual viewer many marvels of the Milky Way Galaxy, other nearby galaxies, and the remote galaxian cosmos. Recently, stunning composite images of X-ray, visible, and infrared emission from regions of cosmic tumult have vivified the many powerful dramas that continue to unfold upon the galactic stage.
Yet, despite popular enthusiasm for the wonders of space, most people still do not bother to find a dark site and witness the source of these wonders for themselves. Otherwise, my informal polling would have indicated that they knew about the Milky Way as a naked-eye marvel. I suppose it comes down to the delivery of experiences. We have grown accustomed to having our experiences conveyed to us in familiar, safe, and readily-accessible packages – be they books, magazines, television programs, planetarium shows, or interactive websites.
Regarding the latter, consider the Zooniverse online portal where anybody with an internet connection can contribute to authentic scientific research. With just your eyes and hands, you can search for exoplanets around distant suns, trace out star-blown bubbles in our galaxy’s interstellar medium, and categorize the types of galaxies that dwell in deep space. To date, close to a million people have contributed to these and sundry other online scientific investigations.
Then there are the mobile apps. One popular type of app, in particular, has brought millions more people closer to the night sky. Google Sky Map, Droid Sky View, The Night Sky, and other interactive planetarium simulators enable a smartphone user to point the phone in any direction and see what stars and constellations are located there. Most of these simulators show the Milky Way as a hazy band, thus cueing the viewer to its existence. But does that mean that more people are making the effort to find dark sites for smartphone-aided star gazing? Is participation in amateur astronomy clubs on the rise as a result? And are star parties at our national parks surging with attendees? My very limited research on these questions suggests that – yes – ever more people are seeking the sublime wonders of dark skies. Whether such interactive apps are responsible for these trends remains unknown. Still, I remain optimistic.
Perhaps our electronic addictions and virtual realities will ultimately re-introduce ourselves to the unembellished Milky Way – and to other direct experiences that Nature so generously provides. We may be plugged-in as never before, but still we hunger for authentic interactions with the mysterious ways of Nature. Towards these ends, I urge that we re-double our efforts to preserve the dark night sky through the advocacy of properly-shielded lighting and the establishment of dark-sky sanctuaries. To help in these regards, please visit the International Dark Sky Society’s webpage.
The Zooniverse Blog. We're the world's largest and most successful citizen science platform and a collaboration between the University of Oxford, The Adler Planetarium, and friends.