Tag Archives: citizen science

New Project: Plankton Portal

It’s always great to launch a new project! Plankton Portal allows you to explore the open ocean from the comfort of your own home. You can dive hundreds of feet deep and observe the unperturbed ocean and the myriad animals that inhabit Earth’s last frontier.

Plankton Portal Screenshot

The goal of the site is to classify underwater images in order to study plankton. We’ve teamed up with researchers at the University of Miami and Oregon State University who want to understand the distribution and behaviour of plankton in the open ocean.

The site shows you one of millions of plankton images taken by the In Situ Ichthyoplankton Imaging System (ISIIS), a unique underwater robot engineered at the University of Miami. ISIIS operates as an ocean scanner, casting the shadows of tiny, transparent oceanic creatures onto a very high resolution digital sensor at very high frequency. So far, ISIIS has been used in several oceans around the world to detect the presence of larval fish, small crustaceans and jellyfish in ways never before possible. This new technology can help answer important questions, ranging from how plankton disperse, interact and survive in the marine environment to which physical and biological factors could influence the plankton community.

The dataset used for Plankton Portal comes from a period of just three days in Fall 2010. In those three days, the team collected so much data that it would take them more than three years to analyze it all themselves. That’s why they need your help! A computer will probably be able to tell the difference between major classes of organisms, such as a shrimp versus a jellyfish, but distinguishing different species within an order or family is still best done by the human eye.

If you want to help, you can visit http://www.planktonportal.org. A field guide is provided, and there is a simple tutorial. The science team will be on Plankton Portal Talk to answer any questions, and the project is also on Twitter, Facebook and Google+.

Zooniverse: Live

Yesterday we pushed Zooniverse Live to be… er… live. Zooniverse Live is a constantly updated screen showing live updates from most of our projects. You’ll see a map displaying the locations of recent Zooniverse volunteers’ classifications and a fast-moving list of recently classified images. Zooniverse Live is on display in our Chicago and Oxford offices, but we thought it would be cool to share it with everyone.

At the time this screenshot was taken, the USA was very active and Snapshot Serengeti was busy.

The Zooniverse is a very busy place these days and we’ve been looking for ways to visualize activity across all the projects. Zooniverse Live is a fairly simple web application. Its backend is written in Clojure (pronounced Closure) and the front end is written in JavaScript using a library for data visualization called D3. The Zooniverse Live server listens to a stream of classification information provided by the Zooniverse projects – via a database technology called Redis. Zooniverse Live then updates its own internal database of classifications on the backend, with the front end periodically asking for updates.
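
We haven’t published the backend code here, but the general pattern – subscribe to the Redis stream of classifications and keep a small store that the front end can poll – looks roughly like this minimal Python sketch (the channel name and message fields are assumptions for illustration):

    import json
    import redis  # requires the redis-py package

    # Connect to the Redis server the projects publish to (host and channel name assumed).
    client = redis.Redis(host="localhost", port=6379)
    pubsub = client.pubsub()
    pubsub.subscribe("classifications")

    recent = []  # in-memory buffer that an HTTP endpoint could serve to the front end

    for message in pubsub.listen():
        if message["type"] != "message":
            continue  # skip subscription confirmations
        event = json.loads(message["data"])
        # Keep only the fields the map and image ticker need (field names assumed).
        recent.append({
            "project": event.get("project"),
            "ip": event.get("ip"),
            "image": event.get("image_url"),
        })
        del recent[:-500]  # cap the buffer at the 500 most recent classifications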

The secret sauce is figuring out where users are classifying from. Zooniverse Live does that using IP addresses. Everyone connected to the internet is assigned an IP address by their Internet Service Provider (ISP). While the address assigned may change each time a computer connects to the internet, each address is unique and can be tied to a rough geographical area. When Zooniverse projects send their classifications to Zooniverse Live, they include the IP address the user was classifying from, letting Zooniverse Live look up the user’s location and plot it on the map. The locations obtained in this way are approximate, and in most cases represent your local Internet exchange.
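
The lookup itself isn’t shown here, but an IP-to-location lookup of this kind is typically done against an offline GeoIP database. Here is a minimal Python sketch using MaxMind’s GeoLite2 data (the database path and example address are placeholders, not what Zooniverse Live actually uses):

    import geoip2.database  # requires the geoip2 package and a GeoLite2 database file

    # Open the city-level GeoIP database (path is a placeholder).
    reader = geoip2.database.Reader("GeoLite2-City.mmdb")

    def locate(ip_address):
        """Return an approximate (latitude, longitude) for an IP address."""
        response = reader.city(ip_address)
        return response.location.latitude, response.location.longitude

    # Example: coordinates ready to plot on the map (example address only).
    print(locate("128.101.101.101"))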

Hopefully you’ll enjoy having a look at Zooniverse Live, and we’d love to hear ideas for other Zooniverse data visualizations you’d like to see.

Chasing Storms Online with the New Cyclone Center

Cyclone Center has recorded almost 250,000 classifications from volunteers around the world since its launch in September 2012. We’ve had lots of feedback on the project and have recently made significant changes that we think will make the experience of classifying storms more rewarding.

Patterns in storm imagery are best recognized by the human eye, so the scientists behind Cyclone Center are asking you to help look through 30 years of images of tropical storms. The end product will be a new global tropical cyclone dataset that could not be realistically obtained in any other fashion. We have already found that the pattern matching by our classifiers is doing better in many cases than a computer algorithm on the same images – this is very exciting!

The biggest change to the site is that we’re now targeting storms for classification. We’ve shifted to a system where the whole community works on particular storms until they’re finished. This produces useful data very quickly – and means we can classify timely and scientifically useful storms as needed. These targeted storms will change frequently as you help us complete each one. You can check a box on the Cyclone Center home page to be alerted when new targeted storms appear: we hope to recruit a horde of enthusiastic online storm chasers this way.

Cyclone Center Homepage

We’ve added much more inline classification guidance – gone are the days of clicking on question marks to get help. For each step in the process, you will be shown information on how best to answer the question. We think this will give you more confidence in what you are doing and hopefully inspire you to do even more!

We’ve improved the tutorial and we’re providing more feedback as you go along – now, instead of waiting for several images to see the “Storm Stats” page, you will go there immediately after your first image. We’ve also upgraded Cyclone Center Talk, which allows for better searching and highlights more of the interesting discussions going on among citizen scientists.

All-in-all it’s a big change for an awesome project. Log in to Cyclone Center today and give the new version a try. Don’t forget to check the box to start getting alerted to new storms as they appear: this will be incredibly useful for the research behind the site, and means you can be the first to classify data on new storms.

[Visit http://www.cyclonecenter.org and see the blog at http://blog.cyclonecenter.org]

Zoo Tools: A New Way to Analyze, View and Share Data

Since the very first days of Galaxy Zoo, our projects have seen amazing contributions from volunteers who have gone beyond the main classification tasks. Many of these examples have led to scientific publications, including Hanny’s Voorwerp, the ‘green pea’ galaxies, and the circumbinary planet PH1b.

One common thread that runs through the many positive experiences we’ve had with the volunteers is the way in which they’ve interacted more deeply with the data. In Galaxy Zoo, much of this has been enabled by linking to the Sloan SkyServer website, where you can find huge amounts of additional information about the galaxies on the site (redshift, spectra, magnitudes, etc). We’ve now added similar links on other projects, pointing to the Kepler database on Planet Hunters and to data on the location and water conditions in Seafloor Explorer.

The second part of this that we think is really important, however, is providing ways in which users can actually use and manipulate this data. Some users have already been very resourceful in developing their own analysis tools for Zooniverse projects, or have done lots of offline work pulling data into Excel, IDL, Python, and lots of other programs (see examples here and here). We want to make using the data easier and available to more of our community, which has led to the development of Zoo Tools (http://tools.zooniverse.org). Zoo Tools is still undergoing some development, but we’d like to start by describing what it can do and what sort of data is available.

An Example

Zoo Tools works in an environment which we call the Dashboard – each Dashboard can be thought of as a separate project that you’re working on. You can create new Dashboards yourself, or work collaboratively with other people on the same Dashboard by sharing the URL.

Zoo Tools Main Page

Create a New Dashboard

Within the Dashboard, there are two main functions: selecting/importing data, and then using tools to analyze the data.

The first step for working with the Dashboard is to select the data you’d like to analyze. At the top left of the screen, there’s a tab named “Data”. If you click on this, you’ll see the different databases that Zoo Tools can query. For Galaxy Zoo, for example, it can query the Zooniverse database itself (galaxies that are currently being classified by the project), or you can analyze other galaxies from the SDSS via their SkyServer website.

Import Data from Zooniverse

Clicking on the “Zooniverse” button, for example, you can select galaxies in one of four ways: from a Collection (either your own or someone else’s), from your recently classified galaxies, from galaxies that you’ve favorited, or by entering specific Zooniverse IDs. Selecting any of these will import them as a dataset, which you can start to look at and analyze. In this example we’ll import 20 recent galaxies.

Import 20 Recents

After importing your dataset, you can use any of the tools in Dashboard (which you can select under “Tools” at the top of the page) on your data. After selecting a tool, you choose the dataset that you’d like to work with from a dropdown menu, and then you can begin using it. For example: if I want to look at the locations of my galaxies on the sky, I can select the “Map” tool. I then select the data source I’d like to plot (in this case, “Zooniverse–1”) and the tool plots the coordinates of each galaxy on a map of the sky. I can select different wavelength options for the background (visible light, infrared, radio, etc), and could potentially use this to analyze whether my galaxies are likely to have more stars nearby based on their position with respect to the Milky Way.

The other really useful part is that the tools can talk to each other, passing data back and forth. For example: you could import a collection of galaxies and look at their colour in a scatterplot. You could then select only certain galaxies in that tool and plot their positions on the map. This is what we do in the screenshots below:

Slideshow of Dashboard screenshots

Making Data Analysis Social

You can also share Dashboards with other people. From the Zoo Tools home page you can access your existing dashboards as well as delete them and share them with others. You can share on Twitter and Facebook or just grab the URL directly. For example, the Dashboard above can be found here – with a few more tools added as a demonstration.

Sharing a Dashboard

This means that once you have a Dashboard set up and ready to use, you can send it to somebody else to use too. Doing this will mean that they see the same tools in the same configuration, but on their own account. They can then either replicate or verify your work – or branch off and use what you were doing as a springboard for something new.

What ‘Tools’ Are There?

Currently, there are eight tools available for both regular Galaxy Zoo and the Galaxy Zoo Quench projects:

  • Histogram: makes bar charts of a single data parameter
  • Scatterplot: plot any two data parameters against each other
  • Map: plot the position of objects on the sky, overplotted on maps of the sky at different wavelengths (radio, visible, X-ray, etc.)
  • Statistics: compute some of the most common statistics on your data (e.g. mean, minimum, maximum).
  • Subject viewer: examine individual objects, including both the image and all the metadata associated with that object
  • Spectra: for galaxies in the SDSS with a spectrum, download and examine the spectrum.
  • Table: list the metadata for all objects in a dataset. You can also use this tool to create new columns from the existing data – for example, take the difference between two magnitudes to define the color of a galaxy (see the short example after this list).
  • Color-magnitude: look at how the color and magnitude of galaxies compare to the total population of Galaxy Zoo. A really nice way of visualizing and analyzing how unusual a particular galaxy might be.
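
To make the Table example above concrete: a galaxy’s color is simply the difference between its magnitudes in two filters. A tiny Python sketch with made-up SDSS g and r magnitudes:

    def color_index(mag_blue, mag_red):
        """Color is the difference of two magnitudes; smaller values mean bluer objects."""
        return mag_blue - mag_red

    # Hypothetical g and r magnitudes for one galaxy:
    print(color_index(14.2, 13.5))  # g - r = 0.7, typical of a red, early-type galaxy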

We have one tool up and running for Space Warps called Space Warp Viewer. This lets users adjust the color and scale parameters of an image to examine potential gravitational lenses in more detail.

Snapshot Serengeti Dashboard

Finally, Snapshot Serengeti has several of the same tools that Galaxy Zoo does, including Statistics, Subject Viewer, Table, and Histogram (aka Bar Graph). There’s also an Image Gallery, where you can examine the still images from your datasets, and we’re working on an Image Player. There are also a few very cool and advanced tools we started developing last week – they’re not yet deployed, but we’re really excited about letting you follow activity over many seasons or focus on particular cameras. Stay tuned. You can see an example Serengeti Dashboard, showing the distribution of cheetahs, here (it’s also shown in the screenshot above).

We hope that Zoo Tools will be an important part of all Zooniverse projects in the future, and we’re looking forward to you trying them out. More to come soon!

Welcome to the Worm Watch Lab

Today we launch a new Zooniverse project in association with the Medical Research Council (MRC) and the Medical Research Foundation: Worm Watch Lab.

We need the public’s help in observing the behaviour of tiny nematode worms. When you classify on wormwatchlab.org you’re shown a video of a worm wriggling around. The aim of the game is to watch and wait for the worm to lay eggs, and to hit the ‘z’ key when it does. It’s very simple and strangely addictive. By watching these worms lay eggs, you’re helping to collect valuable data about genetics that will assist medical research.

Worm Watch Lab

The MRC have built tracking microscopes to record these videos of crawling worms. A USB microscope is mounted on a motorised stage connected to a computer. When the worm moves, the computer analyses the changing image and commands the stage to move to re-centre the worm in the field of view. Because the trackers work without supervision, they can run eight of them in parallel to collect a lot of video! It’s these movies that we need the public to help classify.
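
The MRC’s tracker software isn’t reproduced here, but the closed loop it describes – analyse the latest frame, find the worm, nudge the stage to re-centre it – can be sketched in a few lines of Python with OpenCV (the stage driver below is a hypothetical stand-in for the real hardware interface):

    import cv2  # OpenCV, for the image-analysis half of the loop

    class Stage:
        """Hypothetical placeholder for the motorised-stage driver."""
        def move_relative(self, dx, dy):
            print(f"move stage by ({dx:.1f}, {dy:.1f}) pixels")

    def worm_offset(frame):
        """Return the worm centroid's offset from the image centre, in pixels."""
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dark worm on a bright background: threshold, then take image moments.
        _, mask = cv2.threshold(grey, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        m = cv2.moments(mask)
        if m["m00"] == 0:
            return 0.0, 0.0  # nothing detected; leave the stage where it is
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        h, w = grey.shape
        return cx - w / 2, cy - h / 2

    # Closed loop: grab a frame, measure the offset, command the stage to re-centre.
    camera = cv2.VideoCapture(0)  # stand-in for the USB microscope
    stage = Stage()
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        dx, dy = worm_offset(frame)
        stage.move_relative(dx, dy)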

By watching movies of the nematode worms, we can understand how the brain works and how genes affect behaviour. The idea is that if a gene is involved in a visible behaviour, then mutations that break that gene might lead to detectable behavioural changes. The type of change gives us a hint about what the affected gene might be doing. Although it is small and has far fewer cells than we do, the worm used in these studies (called C. elegans) has almost as many genes as we do! We share a common ancestor with these worms, so many of their genes are closely related to human genes. This presents us with the opportunity to study the function of genes that are important for human brain function in an animal that is easier to handle, great for microscopy and genetics, and has a generation time of only a few days. It’s all quite amazing!

To get started visit www.wormwatchlab.org and follow the tutorial. You can also find Worm Watch Lab on Facebook and on Twitter.

52 Years of Human Effort

At ZooCon last week I spoke about the scale of human attention that the Zooniverse receives. One of my favourite stats in this realm (from Clay Shirky’s book ‘Cognitive Surplus’) is that in the USA, adults cumulatively spend about 200 billion hours watching TV every year. By contrast it took 100 million hours of combined effort for Wikipedia to reach its status as the world’s encyclopaedia.

In the previous year people collectively spent just shy of half a million hours working on Zooniverse projects. Put another way, the community invested about 52 years’ worth of effort[1]. That’s to say that if an individual sat down and did nothing but classify on Zooniverse sites for 52 years, they’d only just have done the same amount of work as our community did between June 2012 and June 2013. The number is always rising, too. Citizen science is amazing!

Another way of thinking about it is to convert this time into Full Time Equivalents (FTEs). One person working 40 hours per week for 50 weeks a year works 2,000 hours a year – that’s 1 FTE. So our 460,000 hours of Zooniverse effort are equivalent to 230 FTEs. It’s as if we had a building with 230 people in it who came in every day just to click on Zooniverse projects.
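
If you want to check the arithmetic, here it is as a few lines of Python (using the rounded 460,000-hour figure):

    hours = 460_000            # total Zooniverse effort, June 2012 to June 2013
    fte_hours = 40 * 50        # one full-time worker: 40 hours/week for 50 weeks
    print(hours / fte_hours)   # 230.0 full-time equivalents
    print(hours / (24 * 365))  # roughly 52.5 years of round-the-clock clicking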

Zooniverse Effort Distribution to June 2013

This amazing investment by the community is not broken down evenly of course, as the above ‘snail’ chart shows. In fact Planet Hunters alone would occupy 62 of the people in our fictional building: the project took up 27% of the effort in the last year. Galaxy Zoo took 17%, which means it had almost 9 years of your effort all to itself. Planet Four had a meteoric launch on the BBC’s Stargazing Live less than six months ago and since that time it has gobbled up just over 5 years of human attention – 10% of the whole for the past year.

What’s wonderful is that our 230 metaphorical workers, and the 52 years they represent, are not confined to one building or one crazed click-worker. Our community is made up of hundreds of thousands of individuals across the world – 850,000 of whom have signed up through zooniverse.org. Some of them have contributed a single classification; others have given our researchers far, far more of their time and attention. Through clicking on our sites, discussing ideas on Talk, or just spreading the word, Zooniverse volunteers are making a significant contribution to research in areas from astronomy to zoology.

Congratulations to everyone who’s taken part and let’s hope this number increases again by next year!

[1] In my ZooCon talk I incorrectly gave the figure of 35 years. This was wrong for two reasons: firstly, I had neglected the Andromeda Project, Planet Four and Snapshot Serengeti for technical reasons; secondly, in my rush to get my slides ready, I had calculated the numbers incorrectly and underestimated them all by about 20%.

Got An Idea for a Zooniverse Project? Propose One

For more than a year, we’ve been openly accepting proposals for new Zooniverse projects and this has brought to life projects such as Seafloor Explorer, Snapshot Serengeti, Notes from Nature and Space Warps.

Yesterday, five Zooniverse projects were featured in The Biologist’s 10 Great Citizen Science Projects – several of them were ideas proposed by researchers we had never met before they came to us and said ‘hey, I have a cool idea for a project’. We’ve also recently seen articles about how the Zooniverse might be able to help in a crisis and how we provide an excellent avenue for proactive procrastination. Citizen science projects are wide and varied, and lots of researchers have great ideas.

So this is a good time to remind everyone that we want to hear from researchers with ideas for Zooniverse projects. If that’s you: propose a project! We have funding from the Alfred P. Sloan Foundation to build your great ideas and work with you to further science. We also have an incredibly talented team of designers, developers, educators and researchers who want to make your idea into an awesome new Zooniverse project.

If you want to know more about this, you can get in touch with any of the team, via our general email address, or on Twitter @the_zooniverse. We’re currently working on projects that were proposed earlier this year and we’ll be announcing them soon. Maybe yours will be next?