Atmosphere Scores

A simple formula for ranking items by popularity over time

Selecting a package to use in your project is a fraught process. Perhaps you’re lucky and there are several options from which to choose. How do you make that decision?

Picking a package that isn’t battle-tested may expose you to serious bugs that no one has encountered yet, while picking a package that hasn’t seen activity in months may lead to wasted time and a codebase that falls further behind as updates arrive.

If you tend to trust packages that are well used by production apps under real load, you’re in the majority. Based on developer research and polling our own team, we’ve identified a handful of indicators that help developers decide which package to integrate into their app.

But why check each of these reliability metrics by hand every time you look for a package? Atmosphere simplifies the decision by synthesizing these indicators into the Atmosphere Score.

One score to rule them all

There are many signals that developers rely on to get a feel for package quality, some quantifiable, others less so. The Atmosphere package manager attempts to gather and surface as many of the quantifiable metrics as it can without making things confusing.

However, when it comes to ordering packages in places like the search results, we’re only able to leverage a single dimension. So we need to combine these metrics into a synthetic score to determine sort order. Enter the Atmosphere Score.

The metrics we use

As of this writing we use two metrics to score a package:

  • Number of downloads, both historical and recent.
  • Number of days since the last release.

We plan on expanding our coverage to encompass signals like:

  • GitHub activity
    • Recent commit activity
    • Number of open/closed bugs and pull requests
    • Number of stars
  • Real app usage
    • How many downloads come from unique apps
    • How much traffic those apps receive
  • Atmosphere activity
    • Views
    • Ratings
    • Comments
  • Social activity
    • Twitter mentions
    • etc.

How we combine them (the not-so-secret sauce)

We calculate the Atmosphere Score once a day. Although a package’s download count can tick up many times throughout the day, the score itself doesn’t change much within that window. (This doesn’t hold true for recently updated packages, which is something we’ll revisit in a later Atmosphere revision.)

The nascent Atmosphere Score formula is simple. It applies a power-law decay over time to a base popularity determined by download counts, with a small bias towards the most recent downloads. It’s essentially the same approach used by sites such as Reddit, Hacker News, and Telescope.

  score = (installsInWeek + installsInMonth/2) * (daysSinceUpdate + 30) ^ -1.3
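For concreteness, here’s a minimal JavaScript sketch of that formula. The stats object and its field names are illustrative, not Atmosphere’s actual schema.

  // Compute a package's Atmosphere Score from its recent install counts
  // and the number of days since its last release.
  function atmosphereScore(stats) {
    var basePopularity = stats.installsInWeek + stats.installsInMonth / 2;
    var timeDecay = Math.pow(stats.daysSinceUpdate + 30, -1.3);
    return basePopularity * timeDecay;
  }

  // Example: 500 installs this week, 1400 this month, released 10 days ago.
  // atmosphereScore({ installsInWeek: 500, installsInMonth: 1400, daysSinceUpdate: 10 })
  //   => 1200 * 40^-1.3 ≈ 9.9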

How we store the scores

The Atmosphere Score is the basic unit we use to rank packages, so we store the latest score (calculated daily) on the package document itself. Since we also want to rank trending packages (based on the 7-day change in score), we store that delta on the package as well.

To calculate that delta, and to display the 7-day trend histogram on the package, we also store every package’s score, each day, in a Scores collection.
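
Here’s a rough sketch of that daily bookkeeping, assuming Mongo-style Packages and Scores collections; the collection and field names are ours for illustration, not necessarily Atmosphere’s.

  // Record today's score and denormalize the latest score plus the 7-day
  // delta onto the package document so search results can sort on them.
  function recordDailyScore(packageId, score) {
    var now = new Date();
    var weekAgo = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);

    // One score document per package per day feeds the trend histogram.
    Scores.insert({ packageId: packageId, score: score, date: now });

    // Find the most recent score from at least seven days ago to compute
    // the trending delta.
    var previous = Scores.findOne(
      { packageId: packageId, date: { $lte: weekAgo } },
      { sort: { date: -1 } }
    );
    var delta = previous ? score - previous.score : 0;

    Packages.update(packageId, { $set: { score: score, scoreDelta: delta } });
  }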

Serving the scores and drawing the histogram

If you inspect the HTML of the sparkline on a package lockup (iron-router’s, for example), you’ll see that it’s a simple SVG used to draw the graph. A package like D3 can be useful for such things.
To get the seven scores required for each package we list on the page, we use a “client side join” publication (as detailed here). You’ll notice that when the Atmosphere homepage first loads, there’s a moment when the graphs haven’t loaded yet. Since the lockup works without them, though, the experience holds up just fine.
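
As a sketch of what that might look like, the publication below sends the last seven daily scores for each package shown on the page, and the client reduces them to the points of a simple SVG polyline. The names here are assumptions for illustration, not Atmosphere’s actual code.

  // Publish recent scores for the packages currently listed on the page.
  Meteor.publish('packageScores', function (packageIds) {
    check(packageIds, [String]);
    var weekAgo = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);
    return Scores.find(
      { packageId: { $in: packageIds }, date: { $gte: weekAgo } },
      { fields: { packageId: 1, score: 1, date: 1 } }
    );
  });

  // On the client, turn a package's seven scores into the points attribute
  // of an SVG <polyline> for the sparkline.
  function sparklinePoints(scores, width, height) {
    var max = Math.max.apply(null, scores) || 1;
    var steps = Math.max(scores.length - 1, 1);
    return scores.map(function (score, i) {
      var x = (i / steps) * width;
      var y = height - (score / max) * height;
      return x.toFixed(1) + ',' + y.toFixed(1);
    }).join(' ');
  }
  // e.g. <polyline points="..." fill="none" stroke="currentColor" />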

The Future

As Meteor gains popularity and the volume of packages grows, the Atmosphere Score will be invaluable in helping developers find the best package for their needs. As we add more metrics, we expect the Atmosphere Score to become an even better measure of package trust and quality.

As always, let us know your suggestions in the comments or the issue tracker. Onwards.

3 Comments

  1. Carlo DiCelico

    Absence of quality control on Atmosphere has bitten me time and time again, so this effort is wonderful. However, the formula is still not a particularly good indicator of a package’s quality, merely its popularity. Some packages are rock solid but not really used much, since most people don’t need them or simply prefer to replicate that package’s functionality themselves. I think open issues on GitHub should weigh more highly, as well as the number of passing tests and code coverage of those tests. These three metrics taken together seem like they would be far better indicators of the quality of a package than number of downloads.

    Jul 17, 2014
    • Francisco Calle Moreno

      +1 to raise the importance of the github issues

      Jul 19, 2014
    • David Backeus

      Not sure how reliable that is. The number of issues is usually closely tied to the popularity of the project: the more users a repo has, the more issues get filed.

      Also, an issue is sometimes not really an issue. Some issues are out of the scope of the project but are left open for the sake of documentation. Some are pull requests with working solutions to a problem. Some GitHub users create issues on their own repos as a kind of public TODO list, etc.

      Jul 28, 2014
