Wikimedia Security Team/SDLC

This page documents the desired process for WMF teams to engage the Security Team during development. Not every interaction point will apply to developers outside the WMF.

Security and Privacy should be integrated throughout the software development process at the WMF. There are several points in the process where the Security Team must be involved, and specific training is available to help developers and product managers make good security decisions.

The WMF product development process defines a Concept > Plan > Design > Build > Release > Maintain cycle, and WMF engineering teams are expected to consider security in the following phases of that cycle.

Concept phase

Teams should consider, and possibly document, the question "does this feature make sense from a security perspective?" For example, a service that intentionally exposes sensitive information or violates the WMF privacy policy should probably not proceed past the concept phase. If there is any question about the prospective security impact of a project, please schedule a concept review with the Security Team. A concept review can typically be accomplished in a 30-minute meeting and scheduled within a few business days.

Plan and Design phases

When designing the architecture for your project, security and privacy should be considered regularly throughout the process.

During the design phase, your team can often make design decisions that limit the ways your application can be attacked (minimizing the attack surface) or limit the harm that can be done if your project is compromised. In addition to the secure design principles that we encourage for all developers, you might consider questions such as the following (a sketch illustrating the last point appears after the list):

  • Are the hardware, uptime, and privacy requirements of this service or tool appropriate for a Labs instance, so that a complete compromise of the underlying system as a result of this project is limited to an untrusted Labs instance?
  • Can this service or tool run on an untrusted domain, to limit the impact of a cross-site scripting vulnerability?
  • Can the functionality be limited to a set of trusted users?
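
As a minimal sketch of the last point (the allow-list, user names, and get_current_user callback below are invented for illustration and are not part of any WMF framework), one way a tool might confine privileged functionality to a fixed set of trusted users is:

    # Minimal sketch, not WMF code: confine a tool's privileged actions to a
    # fixed allow-list of trusted users. TRUSTED_USERS and get_current_user
    # are invented; a real tool would use its framework's authentication and
    # authorization hooks rather than a hard-coded set.
    from functools import wraps

    TRUSTED_USERS = {"ExampleAdmin", "ExampleSteward"}  # hypothetical names

    def trusted_users_only(get_current_user):
        """Only run the wrapped handler for users in TRUSTED_USERS."""
        def decorator(handler):
            @wraps(handler)
            def wrapper(*args, **kwargs):
                user = get_current_user()
                if user not in TRUSTED_USERS:
                    raise PermissionError(f"{user!r} is not authorized")
                return handler(*args, **kwargs)
            return wrapper
        return decorator

The useful property is that the restriction is enforced in one place, before any privileged action runs, so it is easy to verify during design and code review.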

For privacy assessment planning, you will need to document what types of data your project will collect, how it will be used within your project, how it will be transferred, where it will be stored, who will have access to it, and whether any third parties will be involved. If your project is likely to significantly impact user privacy, a Privacy Impact Analysis will need to be completed.

For both security and privacy, you should document a data-flow diagram (example) for your project. It should list the following (a brief sketch of such an inventory appears after the list):

  • The different users of your product or feature
  • The major, anticipated components that users will interact with (special pages, API modules, parser functions, service endpoints)
  • Any external services, applications, or scripts that your components will interact with
  • Where your data will be stored
  • How data flows between these users and systems
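
As a sketch only (the feature, components, and data items below are invented, and this is not a required format), the information behind the diagram might first be captured in a simple structured inventory:

    # Hypothetical inventory for the data-flow diagram of an imaginary
    # "recommendations" feature; every name and data item is invented.
    DATA_FLOW = {
        "users": ["anonymous readers", "logged-in editors"],
        "components": [
            "Special:Recommendations (special page)",
            "recommendations API module",
            "recommendation service endpoint",
        ],
        "external_services": ["search cluster"],
        "data_stores": {
            "recommendation cache": "no personal data, short TTL",
            "request logs": "IP address and user agent, retained per policy",
        },
        "flows": [
            "reader -> special page -> API module -> service endpoint",
            "service endpoint -> search cluster (article titles only)",
            "service endpoint -> request logs (IP address, user agent)",
        ],
    }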

From this diagram, the Security Team can highlight the anticipated security controls that will need to be implemented by your project before it is released into production.

Once your team has documented the privacy impact and data-flow diagram, the Security Team can perform a design review. A design review can typically be completed in a 60-minute meeting with a Security Team member and scheduled within 5-10 business days.

Build phase

As your team develops code for your project, they should integrate security controls into their process.

All team members writing or reviewing code should attend a Secure Code training, given regularly by the Security Team. If developers cannot attend the training when it is given, they can also review Security for developers and watch a recorded training session.

All code committed into your project should be reviewed by a second developer. Reviewers should check for security issues as part of their review, as outlined in the MediaWiki code review guidelines.
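
For example (a generic illustration in Python, not taken from the MediaWiki guidelines; the table and function names are invented), a reviewer should flag query construction like the first function below and ask for the parameterized form instead:

    # Generic illustration of an issue reviewers should catch; the schema,
    # function names, and use of sqlite3 are invented for this example.
    import sqlite3

    def get_user_id_unsafe(conn: sqlite3.Connection, name: str):
        # Flag in review: user input interpolated directly into SQL,
        # which allows SQL injection.
        return conn.execute(
            f"SELECT id FROM users WHERE name = '{name}'"
        ).fetchone()

    def get_user_id_safe(conn: sqlite3.Connection, name: str):
        # Preferred: a parameterized query lets the driver handle quoting.
        return conn.execute(
            "SELECT id FROM users WHERE name = ?", (name,)
        ).fetchone()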

Frequently, your team will make use of other open source libraries within the project. When choosing a library, prefer libraries supported by an organization with a dedicated security team that responds to security issues and has a track record of releasing security updates in a timely manner. If the library is being added to MediaWiki's vendor repository, the library should be designed and documented in a way that makes correct use easy. Your team is responsible for appointing someone to ensure that the version of the library included with your project is kept up to date with all security patches.

Your project should have browser tests that document and exercise all interfaces into your code. The Security Team will use your browser tests to seed our security scanning.
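
As a sketch (WMF projects have their own browser-test tooling; the page name below is invented, and Selenium with Python is used here only to illustrate the idea), a test that exercises one user-facing entry point might look like this:

    # Hypothetical browser test that exercises one user-facing entry point,
    # so scanning seeded from the test suite can reach it. The page name
    # "Special:MyFeature" is invented for this example.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def test_my_feature_form_is_reachable():
        driver = webdriver.Firefox()
        try:
            driver.get(
                "https://en.wikipedia.beta.wmflabs.org/wiki/Special:MyFeature"
            )
            # Tests should cover every interface the feature exposes; this
            # one only asserts that the main form is present.
            assert driver.find_element(By.TAG_NAME, "form") is not None
        finally:
            driver.quit()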

Release phase

Except in rare cases, all new services, extensions, and major new features need a security review by a member of the Security Team, or a trusted delegate, before they can be deployed on any WMF sites. Once the review has been scheduled and started, the Security Team member will typically require about 1 hour to review every 1,000 lines of code in your project.

Due to limited capacity on the Security Team, these reviews should be scheduled before the start of the quarter.

Maintenance phase

The Security Team will scan your project weekly in beta with a security scanner. To benefit from this, your project should work correctly in beta, and its browser tests should be kept up to date.

While your project is in production, your team is responsible for fixing any security issues that are reported in the code. Issues will be reported and tracked in Phabricator. Priority will be set by the Security Team according to our prioritization guidelines. Your team should develop security patches in accordance with our patch development process for security issues. If your project is released as part of the MediaWiki tarball, your team should collaborate with Release Engineering to release patches and make the issue public.

Once a security issue has been prioritized, Unbreak Now! issues should have a plan for fixing the issue (with an estimated timeline) documented on the bug within one business day. Similarly, High priority issues should have a plan documented within one week of being prioritized.

Non-WMF teams and projects

These guidelines were developed for WMF engineering teams, but projects developed by individual contributors or external organizations will also benefit from following this general structure if the project is to be deployed on WMF sites. In general, the more engaged with the Security Team you are in the early phases of your project, the more comfortable and prepared we will be to work with you in the later stages. Engaging with the team early will ensure that we can perform the deployment security review, a mandatory prerequisite for all projects deployed onto WMF sites, in a timely manner.

At minimum, you should:

  1. Create a task in Phabricator in the #security-review project, requesting a review. The review will be prioritized and scheduled with other review requests, which may take a significant amount of time.
  2. Review Security for developers, and check your work for these common mistakes.
  3. Coordinate with individuals or teams at the Foundation, and work out a plan to maintain the project after deployment (fix security issues, patch libraries and services).