Into the Zooniverse: Vol II now available!

For the second year in a row, we’re honoring the hundreds of thousands of contributors, research teams, educators, Talk moderators, and more who make Zooniverse possible. This second edition of Into the Zooniverse highlights another 40 of the many projects that were active on the website and app in the 2019–20 academic year.

Image of Into the Zooniverse book

In that year, the Zooniverse launched 65 projects, volunteers submitted more than 85 million classifications, research teams published 35 papers, and hundreds of thousands of people from around the world took part in real research. Wow!

To get your copy of Into the Zooniverse: Vol II, download a free PDF here or order a hard copy on Blurb.com. Note that the cost of the book covers production and shipping; Zooniverse does not receive any profit from sales. According to the printer, printing and binding take 4–5 business days, after which your order ships. To make sure your book arrives before the December holidays, you can use this tool to calculate shipping times.

Read more at zooniverse.org/about/highlights.

Fixed Cross-Site Scripting Vulnerability on Zoomapper App

On 9 November 2020, a security researcher notified us of a cross-site scripting (XSS) vulnerability on our zoomapper application. This service hosts tile sets that are used to render maps for a small number of other Zooniverse applications, but is not connected to any critical Zooniverse infrastructure. This XSS vulnerability could have allowed attackers to execute malicious code in the browsers of users visiting the zoomapper application.

We were able to remediate the vulnerability within hours of the report by disabling the browser GUI for zoomapper (see PR #6). The GUI had been turned on by default for the zoomapper app, but is not necessary to fulfill the app’s intended role.
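As a generic illustration of this class of bug (this is not the actual Tileserver-GL code, just a hypothetical sketch): any page that interpolates untrusted input into HTML without escaping lets an attacker inject script, and escaping the input renders the payload inert.

```python
from html import escape

def render_search_page(query: str) -> str:
    """Build a results page from user input.

    Escaping the query prevents reflected XSS: interpolating it raw
    would let a crafted query such as '<script>...</script>' execute
    in the viewer's browser.
    """
    return f"<h1>Results for: {escape(query)}</h1>"

malicious = "<script>steal(document.cookie)</script>"
page = render_search_page(malicious)
# The payload survives only as harmless text: &lt;script&gt;...
```

Disabling the GUI removed the pages that rendered such input in the browser, which is why it closed off the attack surface without affecting the tile-serving role of the app.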

Additional notes on the incident:

  • The vulnerability had existed since the app was first deployed on 15 September 2020.
  • The vulnerability was located in the underlying Tileserver-GL dependency.
  • No Zooniverse user or project data was exposed by this vulnerability.

We’d like to thank Rachit Verma (@b43kd00r) for bringing this issue to our attention and for following responsible disclosure by reporting it to us in private, as requested on our security page.

News from the Etchiverse – our first results!

Just over three years ago we launched the first Etch A Cell project (https://www.zooniverse.org/projects/h-spiers/etch-a-cell). The project was the first of its kind on the Zooniverse: never before had we asked volunteers to help draw around the small structures inside cells (a task known as ‘manual segmentation of organelles’) visualised with very high-powered electron microscopes. We even had to develop a new tool type on the Zooniverse to do this – a drawing tool for annotating images.

In this first Etch A Cell project, the organelle we asked Zooniverse volunteers to help examine was the nuclear envelope (as you can see shown in green in the image below). The nuclear envelope is a large membrane found within cells. It surrounds the nucleus, which is the part of the cell that contains the genetic material. It’s an important structure to study as it’s known to be involved in a number of diseases, including cancer, and it’s often the first structure research teams inspect in a new data set.

This gif shows an image of a cell taken with an electron microscope. This particular cell is a HeLa cell, a type of cancer cell that is widely used in scientific research. The segmented nuclear envelope is shown in green.

The results…

Earlier this year, we published the first set of results from this project. I’ve summarised some of our most exciting findings below, but if you’d like to take a look at the original paper, you can access it here (https://www.biorxiv.org/content/10.1101/2020.07.28.223024v1.full).

1. Zooniverse volunteers dedicated a huge amount of effort! Volunteers submitted more than 100,000 segmentations across the 4,000 images analysed in this first Etch A Cell project. Through this effort, the nuclear envelopes of 18 cells (shown in green below) were segmented from our original data block (also shown below).

2. Volunteers were very good at segmenting the nuclear envelope. As you can see in the gif and images below, most classifications submitted for each image were really good! Manual segmentation isn’t an easy task, even for experts, so we were really impressed!

An unannotated image is shown on the left. The image on the right shows an overlay of all the volunteer segmentations received for this image. As you can see, most volunteers did a great job at segmenting the nuclear envelope.

3. There’s power in a crowd! The image below shows an overlay of every single segmentation for one of the nuclei studied in Etch A Cell. As you can see, through the collective effort of Zooniverse volunteers, something beautiful emerges – by overlaying everyone’s effort like this, you can see the shape of the nuclear envelope begin to appear!

To make sense of all this data, we developed an analysis approach that averaged these lines to form a ‘consensus segmentation’ for each nuclear envelope. This consensus segmentation, produced through the collective effort of volunteers, was remarkably similar to one produced by an expert microscopist. You can see this in the image below: the expert segmentation of the nuclear envelope of one cell (in yellow, left) is compared to the volunteer segmentation (in green). The top image shows a single slice through the cell; the bottom image shows the 3D reconstruction of the whole nuclear envelope.
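The core idea behind a consensus can be sketched in a few lines of Python. This is a toy stand-in, not the paper’s actual pipeline (the real analysis works on drawn contours, not the ready-made binary masks assumed here): stack every volunteer’s mask and keep the pixels that a majority agreed on.

```python
import numpy as np

# Toy stand-in: three volunteers' segmentations of the same tiny image,
# represented as binary masks (1 = "this pixel is nuclear envelope").
volunteer_masks = np.array([
    [[0, 1, 1, 0],
     [0, 1, 0, 0]],
    [[0, 1, 1, 0],
     [0, 1, 1, 0]],
    [[0, 0, 1, 0],
     [0, 1, 0, 0]],
])

# Fraction of volunteers who marked each pixel...
vote_fraction = volunteer_masks.mean(axis=0)
# ...and a pixel-wise majority vote gives the consensus mask.
consensus = (vote_fraction > 0.5).astype(np.uint8)
```

Pixels marked by only one of the three volunteers drop out, while pixels most volunteers agreed on survive, which is how individual drawing errors wash out of the collective result.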

4. Volunteer segmentations can be used to train powerful new algorithms capable of segmenting the nuclear envelope. We found that volunteer data alone, with no expert data at all, could be used to train computer algorithms to perform nuclear envelope segmentation to a very high standard. In the gif below you can see the computer-predicted nuclear envelope segmentation for each of the cells in pink.

5. Our algorithm works surprisingly well on other data sets. We ran this new algorithm on other datasets that had been produced under slightly different experimental conditions. Because of these differences, we didn’t expect the algorithm to perform very well; however, as you can see in the images below, it did a very good job of identifying the location of the nuclear envelope. Because of this transferability, members of our research team have already begun using this algorithm to aid their new research projects.

The future…

We’re so excited to share these results with you, our volunteer community, and the research communities we collaborate with, and we’re looking forward to building on these findings in the future. The algorithms we’ve been able to produce from this effort are already being used by research teams at the Crick, and we’ve already launched multiple new projects asking for your help to look at other organelles – The Etchiverse is expanding!

You can access all our current Etch A Cell projects through the Etch A Cell Organisation page.

Zooniverse Mobile App Release v2.8.2!

Now it’s even easier to contribute to science from your phone!

On any crowded public bus (before the pandemic), people sat next to each other, eyes fixed on their phones, smiling, swiping. 

What were they all doing? Using a dating app, maybe. Or maybe they were separating wildcam footage of empty desert from beautiful birds. Maybe they were spotting spiral arms on faraway galaxies.

Maybe one of them was you!  

We’ve loved seeing the participation in the Zooniverse through the mobile app (available for iOS and Android) over the past two years. So we made it even easier for you to do that wherever you swipe these days—a park bench, or maybe your home. (Please don’t swipe and drive.)

Right now, you can go into the app and contribute to Galaxy Zoo Mobile, Catalina Outer Solar System Survey, Disk Detective, Mapping Historic Skies, Nest Quest Go, or Planet Four: Ridges. And we have more projects on the way!

What’s new in the app

When you update to version 2.8.2, you’ll notice a slick new look. At the very top, there’s now an “All Projects” category. This will show you everything available for mobile—with the projects that need your help the most sorted at the very top! You can also still choose a specific discipline, of course.

That’s it for brand-new features, but this version also fixes a lot. No more crashing when you tap on browser projects. Far fewer project-related crashes. Animated gifs, which previously worked only on iOS, now also work on Android—so researchers can show you an image that changes over time.

What’s more—and you’ll never see this, but it’s important to us, the developers—we’ve made a lot of changes that help us keep improving the app. We have better crash reporting mechanisms and more complete automated testing. We also updated all of our documentation so that developers from outside our team can contribute to the app, too! We’d love to be a go-to open source project for people who are learning, or working in, React Native (the platform on which our app is built).

Aggregate Functionality

The full list of functionalities now includes:

  • Swipe (binary A-or-B response)
  • Single-answer question (A, B, or C)
  • Multi-answer question (any combination of A, B, and C)
  • Rectangle drawing task (drawing a rectangle around a feature within a subject)
  • Single-image subjects
  • Multi-image subjects (e.g. uploading 2+ images as a single subject; users swipe up/down to display the different images)
  • Animated gifs as subjects
  • Subject auto-linking (automatically linking subjects retired from one workflow into another workflow of interest on the same project)
  • Push notifications (sending messages/alerts about new data, new workflows, etc., via the app)
  • Preview (an owner or collaborator on a project in development being able to preview a workflow in the ‘Preview’ section of the mobile app)
  • Beta Review (mobile enabled workflows are accessible through the ‘Beta Review’ section of the app for a project in the Beta Review process; includes an in-app feedback form)
  • Ability to see a list of all available projects, as well as filter by discipline (with active mobile app workflows listed at the top)

We also carried out a number of infrastructure improvements, including: 

  • Upgraded the React Native libraries we use
  • Created a staging environment to test changes before they go into full production
  • Added test coverage
  • Implemented bug reporting and tracking
  • Completed documentation, so open source contributors can get the app running from our public code repository
  • Made a myriad of additional improvements, such as missing icons no longer crashing the app and refinements to the rectangle drawing task

Note: we will continue developing the app; this is just the end of this phase of effort and a great time to share the results.

If you’re leading a Zooniverse project and have any questions about how to ‘enable on mobile’ in the Project Editor’s ‘workflow’ interface, don’t hesitate to email contact@zooniverse.org. And if you’re a volunteer wondering whether a workflow on a given project could be enabled on mobile, please post in that project’s Talk to start the conversation with the research team and us. The more, the merrier!

Looking forward to having more projects on the mobile app!

A Few Stats of Interest:

  • Since Jan 1, 2020: 
    • 6.2 million classifications submitted via the app (that’s 7% of 86.7 million classifications total through Zooniverse projects)
    • 18,000 installations on iOS + 17,000 on Android
  • Current Active Users (people who have used the app in the last 30 days):
    • 1,800 on iOS + 7,700 on Android

Previous Blog Posts about the Zooniverse Mobile App: