Asteroid Zoo Paused

The AsteroidZoo community has exhausted the data that are available at this time. With all the data examined, we are pausing the experiment; before volunteers spend more time, we want to make sure that we can process your finds through the Minor Planet Center and get highly reliable results.

We understand that it’s frustrating when you’ve put in a lot of work and there isn’t a way to confirm how well you’ve done. But please keep in mind that this was an experiment: how well can humans find asteroids that machines cannot?

Oftentimes in science an experiment runs into dead ends or speed bumps; this is just the nature of science. There is no question that the AsteroidZoo community has found several potential asteroid candidates that machines and algorithms simply missed. However, the conversion of these tantalizing candidates into valid results has encountered a speed bump.

What’s been difficult is that all the processing to make an asteroid find “real” has been based on the precision of a machine – for example, the arc of an asteroid must be the correct shape to within a tiny fraction of a pixel to be accepted as a good measurement. The usual process for achieving such precision is hands-on, and might take several humans weeks to get right. On AsteroidZoo, given the large scale of the data, automating the process of going from clicks to precise trajectories has been the challenge.
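To give a rough sense of what going from clicks to a precise trajectory involves, here is a minimal sketch that fits constant-rate motion to a set of volunteer click positions and reports how far each click sits from the fitted track. The function and the numbers are purely illustrative assumptions, not the AsteroidZoo pipeline or the Minor Planet Center's actual acceptance criteria.

```python
import numpy as np

def worst_residual(times, xs, ys):
    """Fit x(t) and y(t) as straight lines (constant-rate motion) and
    return the largest distance, in pixels, between a click and the fit."""
    t = np.asarray(times, dtype=float)
    fit_x = np.polyfit(t, xs, 1)  # slope and intercept for x(t)
    fit_y = np.polyfit(t, ys, 1)  # slope and intercept for y(t)
    residuals = np.hypot(np.polyval(fit_x, t) - xs,
                         np.polyval(fit_y, t) - ys)
    return residuals.max()

# Click positions (in pixels) for one candidate across four exposures
# (made-up numbers). A real detection would need residuals that are a
# small fraction of a pixel.
print(worst_residual(times=[0.0, 0.25, 0.5, 0.75],
                     xs=[101.2, 105.9, 110.8, 115.4],
                     ys=[220.4, 221.1, 221.9, 222.6]))
```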

While we are paused, there will be updates to both the analysis process, and the process of confirming results with the Minor Planet Center. Updates will be posted as they become available.

https://talk.asteroidzoo.org/
http://reporting.asteroidzoo.org/

Thank you for your time.

Science Learning via Participation in Online Citizen Science

My name is Dr. Karen Masters, and I’m an astronomer working at the University of Portsmouth. My main involvement with the Zooniverse over the last 8 years or so has been through my research into galaxy evolution, making use of the Galaxy Zoo classifications (see the Zooniverse publication list), and as the Project Scientist for Galaxy Zoo, where I enjoy organizing science team telecons and research meetings. I’ve also written many blog posts about galaxy evolution for the Galaxy Zoo blog.

Being involved in Galaxy Zoo has opened many interesting doors for me. I have always had a keen interest in science communication and science education. In fact, working with Galaxy Zoo has been a real pleasure because of the way it blurred the lines between astronomical research and public engagement.

A couple of years ago I was given the opportunity to get more formally engaged in researching how Galaxy Zoo (and other Zooniverse projects) contribute to science communication and education. A colleague of mine in the Portsmouth Business School, who is an expert in the economics of volunteering, led a team (of which I was part) that was successful in obtaining funding for a three-year project to study the motivations of citizen scientists, including how scientific learning contributes to those motivations. We call our project VOLCROWE.

The VOLCROWE survey, which ran in late March/early April of last year, included a science quiz that tested both general science knowledge and knowledge specific to five different projects. This meant that the data collected could be used to investigate, in a statistical sense, how much you are learning about scientific content while classifying on Zooniverse projects.

We collected complete responses to the survey from almost 2000 Zooniverse volunteers spread across Galaxy Zoo, Planet Hunters, Penguin Watch, Seafloor Explorer and Snapshot Serengeti.

The survey respondents certainly believed they were learning about science through their participation. When asked if the Zooniverse (i) lets them learn through direct hands-on experience of scientific research; (ii) allows them to gain a new perspective on scientific research; or (iii) helps them learn about science, an overwhelming majority (more than 80% in all cases) agreed or strongly agreed.

Figure: responses to questions about whether volunteers agreed that the Zooniverse lets them learn through hands-on experience, gives them a new perspective on scientific research, or helps them learn about science.

We were also able to find evidence in the survey responses that project-specific science knowledge correlated positively with measures of active engagement in the project. Put plainly, people who classified more on a given project were found to know more about the scientific content of that project. We could use the scores from the general science quiz as a measure of unrelated scientific knowledge (which did not correlate with how much people classified) to argue that this correlation is causal – i.e. people learn more about the science behind our projects the more time they spend classifying.
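To illustrate the style of this comparison (not the published analysis itself, which is described in Masters et al. 2016), here is a minimal sketch: correlate classification counts with project-specific quiz scores, and with general science quiz scores as the control. The column names and numbers are made up for illustration.

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical survey table: one row per respondent.
df = pd.DataFrame({
    "n_classifications":  [12, 150, 40, 900, 75, 3000, 20, 420],
    "project_quiz_score": [3, 6, 4, 8, 5, 9, 3, 7],
    "general_quiz_score": [6, 5, 7, 6, 5, 7, 6, 5],
})

# Project-specific knowledge vs. activity (expected to correlate)...
rho_proj, p_proj = spearmanr(df["n_classifications"], df["project_quiz_score"])
# ...and general science knowledge vs. activity (the control, expected not to).
rho_gen, p_gen = spearmanr(df["n_classifications"], df["general_quiz_score"])

print(f"project knowledge vs activity: rho = {rho_proj:.2f} (p = {p_proj:.3f})")
print(f"general knowledge vs activity: rho = {rho_gen:.2f} (p = {p_gen:.3f})")
```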

A different VOLCROWE publication, “How is success defined and measured in online citizen science? A case study of Zooniverse projects”, Cox et al. (2015), measured the success of Zooniverse projects using different metrics. In that work we demonstrated that projects could be scientifically successful (i.e. contribute to increased scientific output) without being very successful in public engagement. However, public engagement success without good scientific output was not found in any of the Zooniverse projects studied in Cox et al. (2015). Four of the five projects in our science learning study were part of Cox et al. (2015; Penguin Watch hadn’t launched at that time), and in Masters et al. (2016) we were able to show that, in general, the better a project did on public engagement success metrics, the stronger the correlation we found between scientific knowledge and time spent classifying. This does not seem too surprising, but it’s nice to show with data.

We concluded thus:

“Our results imply that even for citizen science projects designed primarily to meet the research goals of a science team, volunteers are learning about scientific topics while participating. Combined with previous work (Cox et al. 2015) that suggested it is difficult for projects to be successful at public engagement without being scientifically successful (but not vice versa) this has implications for future design of citizen science projects, even those primarily motivated by public engagement aims. While scientific success will not alone lead to scientific learning among the user community, we argue that these works together demonstrate scientific success is a necessary (if not a sufficient) requirement for successful and sustainable public engagement through citizen science. We conclude that the best way to use citizen science projects to provide an environment that facilitates science learning is to provide an authentic science driven project, rather than to develop projects with solely educational aims.”

As you may know, authenticity is at the heart of the Zooniverse Philosophy, so it was really nice to find evidence that backs it up. You know you can trust Zooniverse projects to make use of your classifications and contribute to the sum of human knowledge.

I also had great fun writing this up for publication, a process which involved me learning a great deal about what is meant by “Science Learning” in the context of research into science communication.

It was published today in the Journal of Science Communication, Special Edition in Citizen Science (Part II). You can also read the paper in full in the open access archive at: https://arxiv.org/abs/1601.05973.

What is Penguin Watch 2.0?

We’re getting through the first round of Penguin Watch data – it’s amazing, and it’s doing the job we wanted: to revolutionise the collection and processing of penguin data from the Southern Ocean, and to disentangle the threats of climate change, fishing and direct human disturbance. The data are clearly excellent, but we’re now trying to automate processing them so that results can more rapidly influence policy.

In “Penguin Watch 2.0”, people will be able to see the results of their online efforts to monitor and conserve Antarctica’s penguin colonies. The more alert among you will notice that it’s not fully there yet, but we’re working on it!

We have loads of ideas on how to integrate this with the penguinwatch.org experience so that people are more engaged, learn more and realise what they are contributing to!


For now, we’re doing this the old-fashioned way: anyone (such as schools) who wants to be more engaged can contact us (tom.hart@zoo.ox.ac.uk) and we’ll task you with a specific colony and give you feedback on it.

Primary School Zooniverse Volunteers

Recently my class of 8-9 year old kids from ZŠ Brno, Jihomoravské náměstí (a primary school in the Czech Republic) took part in several Zooniverse projects.


First, they were just talking about their dreams – what they would like to achieve in life. Mostly, they wanted to become sports stars or music celebrities, but some actually considered becoming scientists!

Then they were introduced to the Zooniverse and citizen science. Fascinated by the idea that they can actually contribute to real science (so someone’s dream can come true), they dived into the list of projects on the Zooniverse website. All the cover images and project names were really attractive to them; sadly, only two projects are available in Czech. Anyway, the first project they started – Snapshots at Sea – was in English only. This project, focusing on marine animals, especially cetaceans, is very simple, though. The only task is to say whether there are any animals present in the picture. They learned the English question very quickly and classified over 200 images on their own. They asked various questions about those fascinating animals and seemed hungry for more answers. Initially, they didn’t want to stop classifying, but when they heard the name of the next project to try – Penguin Watch – they were totally into it!


This project, available in Czech, shows wintery images of remote locations in Antarctica, usually crowded with nesting penguins. The tasks here are to mark adult penguins, chicks, or their eggs, and any predators, if present. They took turns marking, trying to mark at least 30 penguins as quickly as possible so they could see another image. They couldn’t wait to find an egg. And after only 9 images they succeeded!

They were curious about Antarctica, as well as about penguins. They wondered why it is so cold there, and how the long polar days and nights come about. Answering that last question would have been a great lead-in to trying a space project, as many of them are available on the Zooniverse. But they decided to try another wildlife project, Chimp & See, which monitors wild animals in Africa, especially chimpanzees and their behaviour. This project wasn’t as easy for them, as they were asked to identify unfamiliar animals in short video clips (they had to learn their names in English while classifying) and then to describe their behaviour using a list of options. Surprisingly, they didn’t mind the language barrier much. After a short while, all of them were standing in front of the screen and everyone wanted to touch it! They seemed to be totally hooked.


The researchers from Chimp & See were so kind as to offer them the chance to choose a name for a currently unidentified juvenile chimp, captured in 4 different video sequences! The kids were really excited by this opportunity and suggested a lot of names to choose from. In the end they voted and all agreed on a single name – Kibu!

When the lesson ended, many of them asked to create their own accounts, so they could participate on their own from home. Next time, we are going to try Plankton Portal and Floating Forests.

Zooniverse projects are really a great opportunity for kids to learn about nature; they bring them closer to real science; and, not to forget, they are great fun!

 

By Zuzana Macháčková, a primary school teacher in Brno and Zooniverse volunteer.

Darren (DZM) New Horizons

Dear Zooniverse community,

I have some news to break to everyone. I’ve accepted a new position at a different company, and while it’s an extremely exciting opportunity for me, it does mean that I have to step away from the Community Builder role here.

This is a bittersweet announcement for me, because as exciting as my new job is for my career, I’ve truly loved my time at the Zooniverse, helping to grow this community and our platform and getting to know so many incredible volunteers, researchers, and staff.

However, I do want to emphasize that this is definitely not goodbye! I couldn’t possibly leave completely—there are so many projects here that I enjoy doing as much as you guys do, and so many exciting developments in the pipeline that I want to see pan out. I’m not going anywhere; instead, I’m becoming one of you: a Zooniverse volunteer. I won’t be your liaison anymore, or a source for reporting your needs, but I’ll continue to be your colleague in people-powered research.

The Zooniverse is growing and changing at an incredible rate right now, and has been for much of my time here over the past 14 months. Overall, I’m blown away by what you’ve all helped us to accomplish. Projects are being launched and completed quickly, and our new research teams are more attuned to volunteers’ needs than ever before. I’ve long believed that the launch of the Project Builder would begin a process of exponentially expanding the scope of the Zoo, and we are definitely beginning to see that happening. I can’t wait to find out, along with the rest of you, what the next chapter of this story has in store for us all.

Thank you all for everything, and I’ll be seeing you all around!

Yours in people-powered research,

Darren “DZM” McRoy

Special note from the ZooTeam — Thank you Darren for all your hard work over the years! We’re so excited for you and this new opportunity. And we very much look forward to continuing to build and strengthen the relationships between our volunteers, research teams, and the Zooniverse team. Thank you all for your contributions! Onward and upward.

The importance of acknowledgement

Trying to understand the vast proliferation of ‘citizen science’ projects is a Herculean task right now, with projects cropping up all over the place dealing with both online data analysis like that which concerns us here at the Zooniverse and with data collection and observation of the natural world via projects like iNaturalist. As the number of projects increases, so do questions about the effectiveness of these projects, and so does our desire to keep track of the impact all of the effort put into them is having.

These aren’t easy questions to answer, and an attempt to track the use of citizen science in the literature is made by Ria Follett and Vladimir Strezov, two researchers in the Department of Environmental Sciences at Macquarie University, in a recent paper published in the journal PLOS ONE. They looked at papers including the words ‘citizen science’, and their analysis includes the surprising result that ‘online’ projects accounted for only 12% of their sample. They explain:

The missing articles discussed discoveries generated using “galaxy zoo” data, rather than acknowledging the contributions of the citizens who created this data.

This, to me, is pushing a definition to extremes. Every one of the ‘missing’ papers cited has a link to a list of volunteers who contributed; several have volunteers listed on the author list! To claim that we’re not ‘acknowledging the contributions’ of volunteers because we don’t use the shibboleth ‘citizen science’ is ridiculous. Other Zooniverse projects, such as Planet Hunters, don’t even appear in the study for much the same reason, and it’s sad that a referee didn’t dig deeper into the limited methodology used in the article.

Part of the problem here is the age-old argument about the term ‘citizen science’. It’s not a description most of our volunteers would use of themselves, but rather a term imposed from the academy to describe (loosely!) the growing phenomenon of public participation in scientific research. In most of our Galaxy Zoo papers, we refer to ‘volunteers’ rather than ‘citizen scientists’ – and we believe strongly in acknowledging the contributions of everyone to a project, whatever term they choose to label themselves with.

Chris

Lost Classifications

We’re sorry to let you know that at 16:29 BST on Wednesday last week we made a change to the Panoptes code which had the unexpected result that it failed to record classifications on six of our newest projects: Season Spotter, Wildebeest Watch, Planet Four: Terrains, Whales as Individuals, Galaxy Zoo: Bar Lengths, and Fossil Finder. It was checked by two members of the team – unfortunately, neither of them caught the fact that it failed to post classifications back. When we did eventually catch it, we fixed it within 10 minutes. Things were back to normal by 20:13 BST on Thursday, though by that time each project had lost a day’s worth of classifications.

To prevent something like this happening in the future we are implementing new code that will monitor the incoming classifications from all projects and send us an alert if any of them go unusually quiet. We will also be putting in even more code checks that will catch any issues like this right away.
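As an illustration of the kind of check we mean, here is a minimal sketch that flags projects whose incoming classification rate has dropped far below its usual level. The function, thresholds, and numbers are hypothetical, not the actual Panoptes monitoring code.

```python
def find_quiet_projects(hourly_counts, baseline_counts, min_fraction=0.1):
    """Return projects whose last-hour classification count has dropped
    far below their typical hourly rate."""
    quiet = []
    for project, baseline in baseline_counts.items():
        count = hourly_counts.get(project, 0)
        if baseline > 0 and count < min_fraction * baseline:
            quiet.append((project, count, baseline))
    return quiet

# Example with made-up numbers: Season Spotter has gone unusually quiet.
alerts = find_quiet_projects(
    hourly_counts={"Season Spotter": 2, "Wildebeest Watch": 480},
    baseline_counts={"Season Spotter": 350, "Wildebeest Watch": 500},
)
for project, count, baseline in alerts:
    print(f"ALERT: {project} recorded {count} classifications in the last hour "
          f"(typical rate ~{baseline}/hour)")
```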

It is so important to all of us at the Zooniverse that we never waste the time of any of our volunteers, and that all of your clicks contribute towards the research goals of the project. If you were one of the people whose contributions were lost we would like to say how very sorry we are, and hope that you can forgive us for making this terrible mistake. We promise to do everything we can to make sure that nothing like this happens again, and we thank you for your continued support of the Zooniverse.

Sincerely,

The Zooniverse Team

One line at a time: A new approach to transcription and art history

Today, we launch AnnoTate, an art history and transcription project made in partnership with Tate museums and archives. AnnoTate was built with the average time-pressed user in mind, by which I mean the person who does not necessarily have five or ten minutes to spare, but maybe thirty or sixty seconds.

AnnoTate takes a novel approach to crowdsourced text transcription. The task you are invited to do is not a page, not sentences, but individual lines. If the kettle boils, the dog starts yowling or the children are screaming, you can contribute your one line and then go attend to life.

The new transcription system is powered by an algorithm that will show when lines are complete, so that people don’t replicate effort unnecessarily. As in other Zooniverse projects, each task (in this case, a line) is done by several people, so you’re not solely responsible for a line, and it’s ok if your lines aren’t perfect.
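To make the idea concrete, here is a minimal sketch of line-level retirement: a line is marked complete once several volunteers submit matching (or nearly matching) text. The agreement rule and thresholds are illustrative assumptions, not the actual AnnoTate aggregation algorithm.

```python
from collections import Counter
from difflib import SequenceMatcher

def normalise(text):
    """Lower-case and collapse whitespace before comparing transcriptions."""
    return " ".join(text.lower().split())

def is_line_complete(transcriptions, min_votes=3, similarity=0.9):
    """Mark a line complete once enough volunteers agree on its text."""
    counts = Counter(normalise(t) for t in transcriptions)
    best, votes = counts.most_common(1)[0]
    if votes >= min_votes:
        return True
    # Also accept lines where every answer is a near-identical string.
    others = [t for t in counts if t != best]
    all_close = all(SequenceMatcher(None, best, t).ratio() >= similarity
                    for t in others)
    return all_close and len(transcriptions) >= min_votes

# Three volunteers, nearly identical answers: the line retires.
print(is_line_complete(["Dear John,", "dear john ,", "Dear John,"]))  # True
```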

Of course, if you want to trace the progression of an artist’s life and work through their letters, sketchbooks, journals, diaries and other personal papers, you can transcribe whole pages and documents in sequence. Biographies of the artists are also available, and there will be experts on Talk to answer questions.

Every transcription gets us closer to the goal of making these precious documents word searchable for scholars and art enthusiasts around the world. Help us understand the making of twentieth-century British art!

Get involved now at anno.tate.org.uk

Sunspotter Citizen Science Challenge Update: Zooniverse Volunteers Are Overachievers

An apology is owed to all Zooniverse volunteers; we hugely underestimated the Zooniverse community’s ability to mobilize for the Sunspotter Citizen Science Challenge. You blew our goal of 250,000 new classifications on Sunspotter in a week out of the water! It took 16 hours to reach 250,000 classifications. I’ll say that again: 16 hours!

Within 20 hours you had hit 350,000 classifications. That’s an 11,000% increase over the previous day. By the end of the weekend, the total count stood at over 640,000.

Let’s up the ante, shall we? Our new goal is a cool 1,000,000 classifications by Saturday September 5th.  That would increase the total number of classifications since Sunspotter launched in February 2014 by 50%!

Thank you all for contributing!

P.S. Check out the Basics of a Solar Flare Forecast on the Sunspotter blog from science team member Dr. Sophie Murray.

Sunspotter Citizen Science Challenge: 29th August – 6th September

Calling all Zooniverse volunteers!  As we transition from the dog days of summer to the pumpkin spice latte days of fall (well, in the Northern hemisphere at least) it’s time to mobilize and do science!

Sunspotter Citizen Science Challenge

Our Zooniverse community of over 1.3 million volunteers has the ability to focus efforts and get stuff done. Join us for the Sunspotter Citizen Science Challenge! From August 29th to September 5th, it’s a mad sprint to complete 250,000 classifications on Sunspotter.

Sunspotter needs your help so that we can better understand and predict how the Sun’s magnetic activity affects us on Earth. The Sunspotter science team has three primary goals:

  1. Hone a more accurate measure of sunspot group complexity
  2. Improve how well we are able to forecast solar activity
  3. Create a machine-learning algorithm based on your classifications to automate the ranking of sunspot group complexity
Classifying on Sunspotter

In order to achieve these goals, volunteers like you compare two sunspot group images taken by the Solar and Heliospheric Observatory and choose the one you think is more complex. Sunspotter is what we refer to as a “popcorn project”: you can jump right into the project, and each classification is quick, taking about 1-3 seconds.
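For a flavour of how pairwise “which is more complex?” votes can be turned into a ranking (goal 3 above), here is a minimal Elo-style sketch. It illustrates the general idea only; it is not the Sunspotter team’s actual method.

```python
def elo_update(scores, more_complex, less_complex, k=32.0):
    """Update complexity scores after one vote that `more_complex` beat `less_complex`."""
    ra = scores.get(more_complex, 1000.0)
    rb = scores.get(less_complex, 1000.0)
    expected = 1.0 / (1.0 + 10 ** ((rb - ra) / 400.0))  # chance ra "wins"
    scores[more_complex] = ra + k * (1.0 - expected)
    scores[less_complex] = rb - k * (1.0 - expected)
    return scores

# Three hypothetical votes on three sunspot groups.
scores = {}
for winner, loser in [("group_A", "group_B"),
                      ("group_A", "group_C"),
                      ("group_C", "group_B")]:
    elo_update(scores, winner, loser)

# Highest score = judged most complex.
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```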

Let’s all roll up our sleeves and advance our knowledge of heliophysics!
