Category Archives: Science

The Science of Citizen Science: Meetings in San Jose This Week

Other Galaxy Zoo and Zooniverse scientists and I are looking forward to the Citizen Science Association (CSA) and American Association for the Advancement of Science (AAAS) meetings in San Jose, California this week.

As I mentioned in an earlier post, we’ve organized an AAAS session that is titled, “Citizen Science from the Zooniverse: Cutting-Edge Research with 1 Million Scientists,” which will take place on Friday afternoon. It fits well with the AAAS’s theme this year: “Innovations, Information, and Imaging.” Our excellent line-up includes Laura Whyte (Adler) on Zooniverse, Brooke Simmons (Oxford) on Galaxy Zoo, Alexandra Swanson (U. of Minnesota) on Snapshot Serengeti, Kevin Wood (U. of Washington) on Old Weather, Paul Pharoah (Cambridge) on Cell Slider, and Phil Marshall (Stanford) on Space Warps.

And in other recent Zooniverse news, which you may have heard already: citizen scientists from the Milky Way Project examined infrared images from NASA’s Spitzer Space Telescope and found lots of “yellow balls” in our galaxy. It turns out that these mark early stages of massive star formation, where the newly formed stars heat up the dust grains around them. Charles Kerton and Grace Wolf-Chase have published the results in the Astrophysical Journal.

But let’s get back to the AAAS meeting. It looks like many other talks, sessions, and papers presented there involve citizen science too. David Baker (FoldIt) will give a plenary lecture on post-evolutionary biology and protein structures on Saturday afternoon. Jennifer Shirk (Cornell), Meg Domroese, and others from the CSA have a session Sunday morning in which they will describe ways to utilize citizen science for public engagement. (See also this related session on science communication.) Then in a session Sunday afternoon, people from the European Commission and other institutions will speak about global earth observation systems and about citizen scientists tackling urban environmental hazards.

Before all of that, we’re excited to attend the CSA’s pre-conference on Wednesday and Thursday. (See their online program.) Chris Filardi (Director of Pacific Programs, Center for Biodiversity and Conservation, American Museum of Natural History) and Amy Robinson (Executive Director of EyeWire, a game to map the neural circuits of the brain) will give the keynote addresses there. For the rest of the meeting, as with the AAAS, there will be parallel sessions.

The first day of the CSA meeting will include: many sessions on education and learning at multiple levels; sessions on diversity, inclusion, and broadening engagement; a session on defining and measuring engagement, participation, and motivations; a session on CO2 and air quality monitoring; a session on citizen science in biomedical research; and sessions on best practices for designing and implementing citizen science projects, including talks by Chris Lintott on the Zooniverse and Nicole Gugliucci on CosmoQuest. The second day will bring many more talks and presentations along these and related themes, including one by Julie Feldt about educational interventions in Zooniverse projects and one by Laura Whyte about Chicago Wildlife Watch.

I also just heard that the Commons Lab at the Woodrow Wilson Center is releasing two new reports today, and hardcopies will be available at the CSA meeting. One report is by Muki Haklay (UCL) about “Citizen Science and Policy: A European Perspective” and the other is by Teresa Scassa & Haewon Chung (U. of Ottawa) about “Typology of Citizen Science Projects from an Intellectual Property Perspective.” Look here for more information.

In any case, we’re looking forward to these meetings, and we’ll keep you updated!

Penguin Watch: Top images so far

Yesterday we launched our latest project: Penguin Watch. It is already proving to be one of the most popular projects we run, with over one hundred thousand classifications in the first day! The data come from 50 cameras focussed on the nesting areas of penguin colonies around the Southern Ocean. Volunteers are asked to tag adult penguins, chicks, and eggs.

Here are my favourite images uncovered by our volunteers so far: (click on an image to see what people are saying about it on Penguin Watch Talk)

1st Rule of Penguin Watch – You don’t have to count them all. But I dare you to!


Living on the edge
Penguins aren’t always only black and white…
I think they want in!
Spot the tourists
We’re saved!
Coming back from a refreshing afternoon swim


See what amazing pictures you can find right now at www.penguinwatch.org

Call for Proposals


The Constructing Scientific Communities project (ConSciCom), part of the AHRC’s ‘Science in Culture’ theme, is inviting proposals for citizen science or citizen humanities projects to be developed as part of the Zooniverse platform.

ConSciCom examines citizen science in the 19th and 21st centuries, contrasting and reflecting on engagement with distributed communities of amateur researchers in both the historical record and in contemporary practice.

Between one and four successful projects will be selected from responses to this call, and will be developed and hosted by the Zooniverse in association with the applicants. We hope to include both scientific and historical projects; those writing proposals should review the existing range of Zooniverse projects, which include transcription as well as classification projects. Please note, however, that ConSciCom cannot distribute funds nor support imaging or other digitization in support of the project.

Projects will be selected according to the following criteria:

  1. Merit and usefulness of the data expected to result from the project.
  2. Novelty of the problem; projects which require extending the capability of the Zooniverse platform or serve as case studies for crowdsourcing in new areas or in new ways are welcome.
  3. Alignment with the goals and interests of the Constructing Scientific Communities project. In particular, we wish to encourage projects that:
    1. Have a significant historical dimension, especially in relation to the history of science.
    2. Involve the transcription of text, either in its entirety or for rich metadata.

Note it is anticipated that some, but not necessarily all, selected projects will meet this third criterion; please do submit proposals on other topics.

The deadline for submissions is July 25th 2014. You can submit a proposal by following this link http://conscicom.org/proposals/form/


Become a Sunspotter and play Solar ‘Hot-or-Not’


A few months ago we quietly placed a new project online. Called Sunspotter, it was essentially a game of hot-or-not for sunspot data – and since there were not many images available at the time, we thought it best to just let it be used by the people who noticed it, or who had tried it during the beta test. The results have since been validated, and the site works! In fact there are even preliminary results, which is all very exciting. Loads of new images have now been prepared, so today Sunspotter gets its proper debut. Try it at www.sunspotter.org.

On the site you are shown two images of sunspot groups and asked which is more complex. That might sound odd at first, but really it’s quite easy. The idea behind the science of Sunspotter is summed up neatly on the Sunspotter blog:

I’m pretty sure you have an idea of which is the more complex: a graduate text on quantum mechanics, or an Italian cookbook? On the other hand, it would not be straightforward for a computer to make that choice. The same is true with sunspot groups.

Or put another way: like many things in life, you’ll know complexity when you see it. Try it out now: it works on laptops, desktops, tablets and phones and you can keep up to date on Twitter, Facebook, G+, and the project’s own blog.
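As an aside on how pairwise “which is more complex?” votes can become a ranked list: one common approach is Elo-style scoring. The sketch below is purely illustrative (it is not necessarily the method the Sunspotter team uses, and the image names and votes are made up):

```python
# Sketch: turning pairwise "which is more complex?" votes into a ranking.
# Illustrative only -- NOT Sunspotter's actual algorithm; names/votes invented.

def elo_update(ratings, winner, loser, k=32):
    """Shift ratings after one comparison: the picked ('winner') image
    gains rating in proportion to how surprising the pick was."""
    ra, rb = ratings[winner], ratings[loser]
    expected = 1 / (1 + 10 ** ((rb - ra) / 400))  # chance 'winner' is picked
    ratings[winner] = ra + k * (1 - expected)
    ratings[loser] = rb - k * (1 - expected)

# Hypothetical votes: (image judged more complex, the other image shown)
votes = [("spot_A", "spot_B"), ("spot_A", "spot_C"), ("spot_B", "spot_C"),
         ("spot_A", "spot_B"), ("spot_C", "spot_B")]

ratings = {name: 1000.0 for pair in votes for name in pair}
for winner, loser in votes:
    elo_update(ratings, winner, loser)

ranking = sorted(ratings, key=ratings.get, reverse=True)
print(ranking)  # most complex first
```

Because each update is zero-sum, the total rating stays fixed while consistently preferred images drift to the top.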


Workshop on Citizen Science in Astronomy


This weekend several members of the Zooniverse development team, along with representatives from the science teams of Galaxy Zoo, Planet Hunters, Space Warps, Moon Zoo, Radio Galaxy Zoo, Planet Four, and the Andromeda Project, are traveling to Taipei, Taiwan for the workshop on Citizen Science in Astronomy. I wrote about this workshop last November when it was announced. The Institute of Astronomy and Astrophysics at Academia Sinica (ASIAA) in Taipei, along with the Zooniverse, is organizing this workshop with support from the National Science Council.

The aim is to bring together, for a week, computer scientists, machine learning experts, and the scientists from astronomy- and planetary-science-based citizen science projects. The goal is to take the first steps towards addressing the critical questions and issues that citizen science will need to solve in order to cope with the petabyte data deluge from the next generation of observatories and space missions, such as the Square Kilometre Array (SKA) and the Large Synoptic Survey Telescope (LSST). I think it’s fair to say this is the largest gathering of Zooniverse astronomy and planetary science project teams assembled in one place. I’m looking forward to the new algorithms for better combining and assessing your classifications that will be developed during the week, and to all the interesting results that will come out of this workshop.
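To give a flavour of the aggregation problem these algorithms address, here is a minimal majority-vote sketch for combining volunteers’ labels on a single subject. The labels are invented, and real Zooniverse pipelines are considerably more sophisticated:

```python
from collections import Counter

def combine_classifications(labels):
    """Combine volunteer labels for one subject: return the majority label
    and the fraction of volunteers who chose it (a crude confidence proxy)."""
    label, votes = Counter(labels).most_common(1)[0]
    return label, votes / len(labels)

# Hypothetical classifications of one galaxy by five volunteers
labels = ["spiral", "spiral", "elliptical", "spiral", "spiral"]
print(combine_classifications(labels))  # ('spiral', 0.8)
```

Weighting volunteers by their past agreement with consensus is one of the obvious refinements such a workshop would explore.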

In addition to the main workshop, a teacher workshop for local teachers in Taiwan will be held on March 2nd, co-organized by Lauren Huang (ASIAA), Mei-Yin Chou (ASIAA), Stuart Lynn (Adler Planetarium/Zooniverse), Kelly Borden (Adler Planetarium/Zooniverse), and myself. In preparation for the workshop, the ASIAA Education and Public Outreach (EPO) Office translated Planet Four into traditional-character Chinese. You can find out more about the translation effort here. At the teacher workshop, we’ll introduce citizen science and how it can be used in the classroom, along with presenting the traditional-character Chinese translations of Planet Four and Galaxy Zoo.

The first day of the main workshop will be a series of introductory talks aimed at getting everyone thinking for the working sessions later in the week. If you’re interested in watching the workshop talks, we’re going to try webcasting the first day’s sessions on Google+ starting on March 3rd at 9:30am in Taiwan (March 2nd at 8:30pm EST). The schedule for Monday can be found here. You can find the links for the morning and afternoon video live streams here. If you can’t watch live, the video will be archived and available on YouTube through the same link.

You can follow along for the rest of the week on Twitter with the hashtag #csatro.

Google confirms that the Zooniverse is awesome!

The Zooniverse is extremely pleased to announce that it has been named as one of six Google Global Impact Awardees announced in December 2013. The awards are given by Google to projects that show three key elements:
  • technology or innovative approach that can deliver transformational impact
  • a specific project that tests a big game-changing idea
  • a brilliant team with a healthy disregard for the impossible

The grant we have received from Google as part of their Global Impact Award program will allow us to build a platform that can support hundreds or maybe even thousands of new and exciting citizen science projects. A list of the awardees can be seen at the Google Global Impact Award site here: http://www.google.com/giving/global-impact-awards/

It means a lot to us at the Zooniverse to have been given this award and we could not have managed it without you, our volunteers. The time and effort you dedicate to our projects shows the world how important citizen science can be, and we’re looking forward to the next few years.

So thanks to you, and thanks to Google!

The Zooniverse team

PS: Just to be clear, this is a philanthropic act from Google – we’ll continue to be an academic project run by the team at Oxford, Adler Planetarium and elsewhere and all your data remains with the Zooniverse as before. Nothing changes, except our ability to scale!

Andromeda Project We Hardly Knew Ye

This time last year we launched the Andromeda Project. The aim was to get everyone’s help in locating the star clusters in the Andromeda Galaxy, our next-door neighbour in intergalactic space. The project went better than we could have imagined, and just over two weeks later we had completed more than 1,000,000 classifications and the project’s science team were busy wrangling data.

Cliff at AAS

In fact, in January Cliff Johnson took a poster to one of the world’s biggest astronomy meetings – the January meeting of the AAS – and presented the results from the Andromeda Project, which had only launched 6 weeks prior. It was an amazing example of the power of citizen science to help researchers accomplish the kind of data analysis that computers cannot do reliably.

We decided to do a second round of the Andromeda Project to complete the job we’d started, using both the data that remained in the archive and new data that was only just being taken when the project first launched. So in October 2013 (just two months ago) we once again invited the Zooniverse community to come and find star clusters and galaxies. They once again astounded us by gobbling up the data even faster – ably assisted by a tranche of new users brought to the project from Facebook’s popular I F***king Love Science page. In a week the job was done.

The science team have already begun processing the data from this second round and the results are amazing. In fact: they’re right here just for you, just because it’s nearly Christmas and just because we wanted to give you a present. So here they are: the first maps of all the star clusters and galaxies in the data from the PHAT survey of Andromeda. Marked and classified by  the wonderful Andromeda Project community.

AP Map
Clusters are in blue, galaxies in red. The background image is single-band F475W data showing the galaxy itself.

You can see how the background galaxies are best seen at the outer edges (because we are looking through less material), and the clusters are found predominantly in the spiral arms (where more star formation is happening). These plots will form part of the publications the science team are currently working on, which will most likely appear on the Zooniverse Publications page sometime in 2014. Follow along on the blog, Twitter and Facebook for updates from the science team in the coming weeks and months.

Congratulations to everyone who helped out and gave their time to the Andromeda Project: you were amazing!

So as much as I’d like to wish the Andromeda Project a happy birthday, it seems like I should really wish it a happy retirement. Luckily we have more space-based projects coming soon to the Zooniverse – so the community will have plenty to get along with. However, the Andromeda Project will always have a special place in our hearts for its efficient and dedicated volunteers. Who knows, maybe one day it will come out of retirement for one last hurrah? We can only hope.

Andromeda Project, we hardly knew ye.

ZooTools: Going Deeper With Zooniverse Project Data

One of the best things about being an educator on the Zooniverse development team is the opportunity to interact with teachers who are using, or are interested in using, Zooniverse projects in their classrooms. Teachers cite several reasons why they use these projects – Authentic data? Check. Contributing to cutting-edge research across a variety of scientific fields? Check. Free? Check. Classifying a few galaxies in Galaxy Zoo or identifying and measuring some plankton in Plankton Portal can be an exciting introduction to participating in scientific investigations with “the professionals.” This isn’t enough though; teachers and other educators are hungry for ways to facilitate deeper student engagement with scientific data. Zooniverse educators and developers are consistently asked “How can my students dig deeper into the data on Zooniverse?”

This is where ZooTools comes into play. The Zooniverse development team has recently created ZooTools as a place where volunteers can observe, collect, and analyze data from Zooniverse citizen science projects. These tools were initially conceived as a toolkit for adult volunteers to use to make discoveries within Zooniverse data but it is becoming apparent that these would also have useful applications in formal education settings. It’s worth pointing out that these tools are currently in beta. In the world of web development beta basically means “it ain’t perfect yet.”  ZooTools is not polished and perfect; in fact it’s possible you may encounter some bugs.

Projects like Galaxy Zoo and Planet Hunters have an impressive history of “extra credit” discoveries made by volunteers. Galaxy Zoo volunteers have made major contributions to the astronomy literature through the discovery of the green pea galaxies and Hanny’s Voorwerp. In Planet Hunters, volunteers use Talk to share methods of exploring the project’s light curves and the results they find. ZooTools lowers the barrier to entry by equipping volunteers with some simple tools to look for interesting relationships and results contained within the data. No specialist knowledge required.

We’ve only begun thinking about how ZooTools could be used in the classroom. I started my own investigation with a question that came from a Zooniverse classroom visit last spring. While making observations as a class about some of the amazing animals in Snapshot Serengeti, one young man asked about civets. He wanted to know if they were nocturnal. We had an interesting discussion about how you could find out this information. The general consensus was to Google it or look it up on Wikipedia. I wondered if you could use the data contained within Snapshot Serengeti to come up with a reasonable answer. I was excited to roll up my sleeves and figure out how to use these tools to find a likely answer. Here are the steps I took…

Step 1: Log-in to Zooniverse and go to ZooTools.

Step 1

Step 2: Select a project. Currently only a few projects have data available to explore using ZooTools.

Step 2

Step 3: Create a dashboard.

Step 3

Step 4: Name your dashboard something awesome. I called mine Civets! for obvious reasons.

Step 4

Step 5: This is your blank dashboard.

Step 5

Step 6: It’s time to select a data source. I selected Snapshot Serengeti.

Step 6

Step 7: This is the data source.

Step 7

Step 8: I wanted to be able to filter my data, so I selected Filter under search type. The name of this dataset is Snapshot Serengeti 1.

Step 8

Step 9: Since I wanted to look at civets, I selected that on the species dropdown menu and then clicked Load Data. My dataset will only contain images that Snapshot Serengeti volunteers identified as civets.

Step 9

Step 10: I had my data; next it was time to select a Tool.  I selected Tools at the top of the page.

Step 10

Step 11: I selected Subject Viewer because this tool allows me to flip through different images.

Step 11

Step 12: Next I had to connect my data source to my tool. From the Data Source drop down menu I selected Snapshot Serengeti 1.

Step 12

Step 13: In order to get a good look at the images in my dataset I clicked the icon shaped like a fork to close the pane. I then used the arrows to advance through the images.

Step 13

I flipped through the images and kept track of night versus day. Of the 37 images in my dataset, I observed that 34 were taken at night and 3 were taken during the day. This led me to the conclusion that civets are likely nocturnal. This was so much more satisfying than just going to Google or Wikipedia. A couple of other questions that I explored…

What is the distribution of animals identified at one camera trap site?


 

How many honeybadgers have been observed by Snapshot Serengeti volunteers across different camera traps?

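The civet day/night tally earlier amounts to a simple proportion over the image metadata. As a minimal sketch of the same calculation (the capture hours below are invented stand-ins for the real Snapshot Serengeti timestamps):

```python
# Sketch: estimating nocturnality from image capture times.
# The hours below are invented, standing in for real camera-trap metadata.
capture_hours = [23, 2, 3, 22, 1, 14, 0, 21, 2, 15]  # hour of day, 0-23

def is_night(hour, dusk=18, dawn=6):
    """Crude day/night split by hour; ignores seasonal daylight changes."""
    return hour >= dusk or hour < dawn

night = sum(is_night(h) for h in capture_hours)
fraction = night / len(capture_hours)
print(f"{night}/{len(capture_hours)} images at night ({fraction:.0%})")
```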

Of course this is just the tip of the iceberg. Currently you can use ZooTools to explore data from Galaxy Zoo, Space Warps, and Snapshot Serengeti. The specific tools and datasets available vary from project to project. In Galaxy Zoo, for example, you can look at data from Galaxy Zoo classifications or from the SDSS SkyServer. Hopefully you’ll be inspired to have a play with these tools! What questions would you or your students like to explore?

Our Elusive Milky Way

In the coming months the Zooniverse Education Blog will feature guest posts from participants in the Zooniverse Teacher Ambassadors Workshop. Today’s guest blogger William H. Waller is author of The Milky Way — An Insider’s Guide and co-editor of The Galactic Inquirer — an e-journal and forum on the topics of galactic and extragalactic astronomy, cosmochemistry and astrobiology, and interstellar communications.  Bill’s day job involves teaching courses in physics and astronomy at Rockport High School.

For most of human history, the night sky demanded our attention.  The shape-shifting Moon, wandering planets, pointillist stars, and occasional comet enchanted our sensibilities while inspiring diverse tales of origin.  The Milky Way, in particular, exerted a powerful presence on our distant ancestors.  Rippling across the firmament, this irregular band of ghostly light evoked myriad myths of life and death among the stars.  In 1609, Galileo Galilei pointed his telescope heavenward and discovered that the Milky Way is “nothing but a congeries of innumerable stars grouped together in clusters.”  Fast forward 400 years to the present day, and we find that the Milky Way has all but disappeared from our collective consciousness.  Where did it go?

For 25 years as an astronomy educator, I have informally polled hundreds of students, teachers, and the general public regarding their awareness of the night sky.  Invariably, no more than 25 percent have ever seen the Milky Way with their own eyes.  For city dwellers, this is completely understandable.  Unless properly shielded, the artificial lighting from municipal, commercial, and residential sources will spill into the sky and overwhelm the diffuse band of luminescence that is the hallmark of our home galaxy.  The recent video “The City Dark” produced by POV underscores the disruptive aspects that artificial lighting can produce on the life cycles of certain animals – and even upon ourselves.

View from Goodwood, Ontario before and after a power blackout (Courtesy Todd Carlson)

For residents of small towns well away from large cities (such as my own hometown of Rockport, MA), it is much easier to find dark “sanctuaries” where the Milky Way can be spied in all its exquisite beauty.  Yet when I poll Rockport’s sundry inhabitants about having ever seen the Milky Way, I still get a measly 25% positive response.  What’s going on here?

Is it that they don’t care about astronomy and the night sky?  I would have to say that such astronomical indifference is not typical.  Most people in conversations with me will volunteer their fascination for the planets, stars, and the exotica that our universe provides in abundance – from exoplanets to pulsars, black holes, dark matter, and dark energy.  Images from our great space telescopes have also revealed to the casual viewer many marvels of the Milky Way Galaxy, other nearby galaxies, and the remote galaxian cosmos.  Recently, stunning composite images of X-ray, visible, and infrared emission from regions of cosmic tumult have vivified the many powerful dramas that continue to unfold upon the galactic stage.

Supernova remnant Cassiopeia A, as observed 325 years after a massive star exploded. (X-ray: blue), (Visible: green), (Infrared: red) – NASA

Yet, despite popular enthusiasm for the wonders of space, most people still do not bother to find a dark site and witness the source of these wonders for themselves.  Otherwise, my informal polling would have indicated that they knew about the Milky Way as a naked-eye marvel.  I suppose it comes down to the delivery of experiences.  We have grown accustomed to having our experiences conveyed to us in familiar, safe, and readily-accessible packages – be they books, magazines, television programs, planetarium shows, or interactive websites.

Regarding the latter, consider the Zooniverse online portal where anybody with an internet connection can contribute to authentic scientific research.  With just your eyes and hands, you can search for exoplanets around distant suns, trace out star-blown bubbles in our galaxy’s interstellar medium, and categorize the types of galaxies that dwell in deep space.  To date, close to a million people have contributed to  these and sundry other online scientific investigations.

Then there are the mobile apps.  One popular type of app, in particular, has brought millions more people closer to the night sky.  Google Sky Map, Droid Sky View, The Night Sky, and other interactive planetarium simulators enable a smartphone user to point the phone in any direction and see what stars and constellations are located there.  Most of these simulators show the Milky Way as a hazy band, thus cueing the viewer to its existence.  But does that mean that more people are making the effort to find dark sites for smartphone-aided star gazing?  Is participation in amateur astronomy clubs on the rise as a result?  And are star parties at our national parks surging with attendees?  My very limited research on these questions suggests that – yes – ever more people are seeking the sublime wonders of dark skies.  Whether such interactive apps are responsible for these trends remains unknown.  Still, I remain optimistic.

Perhaps our electronic addictions and virtual realities will ultimately re-introduce us to the unembellished Milky Way – and to other direct experiences that Nature so generously provides. We may be plugged-in as never before, but still we hunger for authentic interactions with the mysterious ways of Nature. Towards these ends, I urge that we re-double our efforts to preserve the dark night sky through the advocacy of properly-shielded lighting and the establishment of dark-sky sanctuaries. To help in these regards, please visit the International Dark-Sky Association’s webpage.

Zoo Tools: A New Way to Analyze, View and Share Data

Since the very first days of Galaxy Zoo, our projects have seen amazing contributions from volunteers who have gone beyond the main classification tasks. Many of these examples have led to scientific publications, including Hanny’s Voorwerp, the ‘green pea’ galaxies, and the circumbinary planet PH1b.

One common thread that runs through the many positive experiences we’ve had with the volunteers is the way in which they’ve interacted more deeply with the data. In Galaxy Zoo, much of this has been enabled by linking to the Sloan SkyServer website, where you can find huge amounts of additional information about galaxies on the site (redshift, spectra, magnitudes, etc). We’ve put in similar links on other projects now, linking to the Kepler database on Planet Hunters, or data on the location and water conditions in Seafloor Explorer.

The second part of this that we think is really important, however, is providing ways in which users can actually use and manipulate this data. Some users have already been very resourceful in developing their own analysis tools for Zooniverse projects, or have done lots of offline work pulling data into Excel, IDL, Python, and lots of other programs (see examples here and here). We want to make using the data easier and available to more of our community, which has led to the development of Zoo Tools (http://tools.zooniverse.org). Zoo Tools is still undergoing some development, but we’d like to start by describing what it can do and what sort of data is available.

An Example

Zoo Tools works in an environment which we call the Dashboard – each Dashboard can be thought of as a separate project that you’re working on. You can create new Dashboards yourself, or work collaboratively with other people on the same Dashboard by sharing the URL.

Zoo Tools Main Page

Create a New Dashboard

Within the Dashboard, there are two main functions: selecting/importing data, and then using tools to analyze the data.

The first step for working with the Dashboard is to select the data you’d like to analyze. At the top left of the screen, there’s a tab named “Data”. If you click on this, you’ll see the different databases that Zoo Tools can query. For Galaxy Zoo, for example, it can query the Zooniverse database itself (galaxies that are currently being classified by the project), or you can also analyze other galaxies from the SDSS via their Sky Server website.

Import Data from Zooniverse

Clicking on the “Zooniverse” button, for example, you can select galaxies in one of four ways: a Collection (either your own or someone else’s), looking at your recently classified galaxies, galaxies that you’ve favorited, or specific galaxies via their Zooniverse IDs. Selecting any of these will import them as a dataset, which you can start to look at and analyze. In this example we’ll import 20 recent galaxies.

Import 20 Recents

After importing your dataset, you can use any of the tools in Dashboard (which you can select under “Tools” at the top of the page) on your data. After selecting a tool, you choose the dataset that you’d like to work with from a dropdown menu, and then you can begin using it. For example: if I want to look at the locations of my galaxies on the sky, I can select the “Map” tool. I then select the data source I’d like to plot (in this case, “Zooniverse–1”) and the tool plots the coordinates of each galaxy on a map of the sky. I can select different wavelength options for the background (visible light, infrared, radio, etc), and could potentially use this to analyze whether my galaxies are likely to have more stars nearby based on their position with respect to the Milky Way.

The other really useful part is that the tools can talk to each other, and can pass data back and forth. For example: you could import a collection of galaxies and look at their colour in a scatterplot. You could then select only certain galaxies in that tool, and then plot the positions of those galaxies on the map. This is what we do in the screenshots below:

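The same select-then-plot chain can be sketched outside the Dashboard in a few lines. The column names, values, and the colour cut below are illustrative assumptions, not the real Zoo Tools schema:

```python
# Sketch of the "tools talking to each other" workflow: select galaxies
# in one tool (a colour cut standing in for a scatterplot selection),
# then pass only the selection on to the sky map.
# Column names and values are illustrative, not the real Zoo Tools schema.
galaxies = [
    {"id": "gz1", "ra": 180.1, "dec": 1.2, "g_minus_r": 0.3},
    {"id": "gz2", "ra": 150.4, "dec": -0.5, "g_minus_r": 0.9},
    {"id": "gz3", "ra": 200.7, "dec": 2.8, "g_minus_r": 0.8},
]

# "Scatterplot" step: keep only the redder galaxies (g - r above a cut)
selected = [g for g in galaxies if g["g_minus_r"] > 0.7]

# "Map" step: the selection, not the full set, is what gets plotted
coords = [(g["ra"], g["dec"]) for g in selected]
print(coords)  # [(150.4, -0.5), (200.7, 2.8)]
```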

Making Data Analysis Social

You can also share Dashboards with other people. From the Zoo Tools home page you can access your existing dashboards as well as delete them and share them with others. You can share on Twitter and Facebook or just grab the URL directly. For example, the Dashboard above can be found here – with a few more tools added as a demonstration.

Sharing a Dashboard

This means that once you have a Dashboard set up and ready to use, you can send it to somebody else to use too. Doing this will mean that they see the same tools in the same configuration, but on their own account. They can then either replicate or verify your work – or branch off and use what you were doing as a springboard for something new.

What ‘Tools’ Are There?

Currently, there are eight tools available for both regular Galaxy Zoo and the Galaxy Zoo Quench projects:

  • Histogram: makes bar charts of a single data parameter
  • Scatterplot: plot any two data parameters against each other
  • Map: plot the position of objects on the sky, overplotted on maps of the sky at different wavelengths (radio, visible, X-ray, etc.)
  • Statistics: compute some of the most common statistics on your data (e.g. mean, minimum, maximum).
  • Subject viewer: examine individual objects, including both the image and all the metadata associated with that object
  • Spectra: for galaxies in the SDSS with a spectrum, download and examine the spectrum.
  • Table: List the metadata for all objects in a dataset. You can also use this tool to create new columns from the data that exists – for example, take the difference between magnitudes to define the color of a galaxy.
  • Color-magnitude: look at how the color and magnitude of galaxies compare to the total population of Galaxy Zoo. A really nice way of visualizing and analyzing how unusual a particular galaxy might be.
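The Table tool’s derived-column idea (colour as a difference of magnitudes) and the Statistics tool boil down to operations like this rough sketch; the magnitudes below are invented for illustration:

```python
import statistics

# Sketch of the Table tool's derived-column idea: define colour as a
# magnitude difference, then summarize it as the Statistics tool would.
# The magnitudes below are invented.
rows = [
    {"id": "gal1", "g": 17.2, "r": 16.5},
    {"id": "gal2", "g": 18.1, "r": 17.9},
    {"id": "gal3", "g": 16.8, "r": 15.9},
]

# New column: g - r colour (larger generally means a redder galaxy)
for row in rows:
    row["g_minus_r"] = round(row["g"] - row["r"], 2)

colours = [row["g_minus_r"] for row in rows]
print(min(colours), max(colours), round(statistics.mean(colours), 2))
```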

We have one tool up and running for Space Warps called Space Warp Viewer. This lets users adjust the color and scale parameters of an image to examine potential gravitational lenses in more detail.

Snapshot Serengeti Dashboard

Finally, Snapshot Serengeti has several of the same tools that Galaxy Zoo does, including Statistics, Subject Viewer, Table, and Histogram (aka Bar Graph). There’s also Image Gallery, where you can examine the still images from your datasets, and we’re working on an Image Player. There are also a few very cool and advanced tools we started developing last week – they’re not yet deployed, but we’re really excited about letting you follow the activity over many seasons or focus on particular cameras. Stay tuned. You can see an example Serengeti Dashboard, showing the distribution of Cheetahs, here (it’s also shown in the screenshot above).

We hope that Zoo Tools will be an important part of all Zooniverse projects in the future, and we’re looking forward to you trying them out. More to come soon!