Wikimedia Security Team/SDLC

Security and Privacy should be integrated throughout the software development process at the WMF. There are several points in the process where the Security Team must be involved, and specific training is available to help developers and product managers make good security decisions.

When specifically looking at the WMF product development process, which defines a Concept > Plan > Design > Build > Release > Maintain cycle, teams are expected to consider security in the following phases.

Concept phase
Teams should consider, and possibly document, "does this feature make sense from a security perspective?". For example, a service that intentionally exposes sensitive information or violates the WMF privacy policy should probably not proceed past the concept phase. If there is any question about the prospective security impact of a project, please schedule a concept review with the Security Team. These reviews can typically be accomplished in a 30-minute meeting and scheduled within a few business days.

Plan and Design phases
When designing the architecture for your project, security and privacy should be considered regularly throughout the process.

Often, during the design phase your team can make design decisions that limit the ways your application can be attacked (minimizing the attack surface) or limit the harm that can be done if your project is compromised. In addition to the secure design principles that we encourage for all developers, you might consider:
 * Are the hardware, uptime, and privacy requirements of this service or tool appropriate for a Labs instance, so a complete compromise of the underlying system as a result of this project will be limited to an untrusted Labs instance?
 * Can this service or tool run on an untrusted domain, to limit the impact of a cross-site scripting vulnerability?
 * Can the functionality be limited to a set of trusted users?
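The last point above, limiting functionality to trusted users, can be enforced with an explicit permission check at every privileged entry point. The sketch below illustrates the idea in Python with hypothetical names (MediaWiki itself implements this via its user-rights system); it is not WMF code.

```python
# Illustrative sketch (hypothetical names): gate privileged functionality
# behind a named right, in the spirit of MediaWiki's user-rights model.

class MissingRightError(Exception):
    """Raised when a caller lacks the right a feature requires."""

def require_right(right):
    """Decorator that rejects callers lacking the named right."""
    def decorator(func):
        def wrapper(user, *args, **kwargs):
            if right not in user.get("rights", set()):
                raise MissingRightError(f"user lacks required right: {right}")
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@require_right("delete-batch")
def run_batch_deletion(user, page_ids):
    # Privileged functionality, reachable only by trusted users.
    return f"deleted {len(page_ids)} pages"
```

Checking the right inside the privileged function itself, rather than only hiding the UI entry point, means the restriction holds even if an attacker calls the API directly.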

For privacy assessment planning, you will need to document what types of data your project will collect, how it will be used within your project, how it will be transferred, where it will be stored, who will have access to it, and if any third-parties will be involved. If your project is likely to significantly impact user privacy, a Privacy Impact Analysis will need to be completed.

For both security and privacy, you should document a data-flow diagram (example) of your project. This should list:
 * The different users of your product or feature
 * The major, anticipated components that users will interact with (special pages, API modules, parser functions, service endpoints)
 * Any external services, applications, or scripts that your components will interact with
 * Where your data will be stored
 * How data flows between these users and systems

From this diagram, the Security Team can highlight the anticipated security controls that will need to be implemented by your project before it is released into production.

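Even before drawing a diagram, the same information can be captured as a plain list of (source, destination, data) edges. The sketch below uses a hypothetical feedback feature to show the level of detail reviewers need; all component names are illustrative, not part of any real project.

```python
# Hypothetical data-flow listing for an imaginary feedback feature.
# Each edge records: (source, destination, data carried).
data_flows = [
    ("reader", "SpecialFeedback page", "feedback text"),
    ("SpecialFeedback page", "feedback API module", "feedback text"),
    ("feedback API module", "feedback DB table",
     "feedback text, user id, timestamp"),
    ("curator (trusted user)", "feedback API module", "moderation actions"),
]

def stores(flows):
    """Identify data stores: destinations that no flow leaves from."""
    sources = {src for src, _, _ in flows}
    return sorted({dst for _, dst, _ in flows if dst not in sources})
```

A listing like this makes it easy to walk every edge and ask where data crosses a trust boundary, who can write to each store, and which flows carry personal information.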
Once your team has documented the privacy impact and data-flow diagram, the Security Team can perform a design review. These typically can be completed in a 60-minute meeting with a Security Team member and scheduled within 5-10 business days.

Build phase
As your team develops code for your project, they should integrate security controls into their process.

All team members writing or reviewing code should attend a Secure Code training, given regularly by the Security Team. If developers cannot attend the training when it is given, they can also review Security for developers and watch a [http://# recorded training session].

All code committed into your project should be reviewed by a second developer. Reviewers should check for security issues as part of their review, as outlined in the MediaWiki code review guidelines.
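To make the security portion of review concrete, here is the kind of issue a reviewer should catch: user-controlled text interpolated into HTML without escaping, which is a cross-site scripting vector. The example is a hedged Python sketch (function names hypothetical); MediaWiki code would use its own escaping helpers instead.

```python
import html

def render_greeting_unsafe(username):
    # BAD: attacker-supplied username is emitted as raw markup,
    # so "<script>...</script>" would execute in the viewer's browser.
    return f"<p>Hello, {username}!</p>"

def render_greeting(username):
    # GOOD: escape user input before it reaches HTML output.
    return f"<p>Hello, {html.escape(username)}!</p>"
```

A reviewer scanning a change for every place untrusted input reaches an output sink (HTML, SQL, shell, logs) will catch most injection-class bugs before they merge.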

Frequently, your team will make use of other open source libraries within the project. When choosing a library, prefer libraries supported by an organization with a dedicated security team that responds to security issues and has a track record of releasing security updates in a timely manner. If the library is being added to MediaWiki's vendor repository, it should be designed and documented in a way that makes correct use easy. Your team is responsible for appointing someone to ensure that the version of the library included with your project is kept up to date with all security patches.

Your project should have browser tests that document and exercise all interfaces into your code. The Security Team will use your browser tests to seed our security scanning.

Release phase
Except in rare cases, all new services, extensions, and major new features need a security review by a member of the Security Team, or a trusted delegate, before they can be deployed on any WMF sites. Once the review has been scheduled and started, the Security Team member will typically need about 1 hour for every 1,000 lines of code in your project.

Due to limited capacity on the Security Team, these reviews should be scheduled before the start of the quarter.

Maintenance phase
The Security Team will scan your project weekly in beta with a security scanner. To benefit from this, your project should work correctly in beta, and browser tests should be kept up-to-date.

While your project is in production, your team is responsible for fixing any security issues that are reported in the code. Issues will be reported and tracked in Phabricator. Priority will be set by the Security Team according to our prioritization guidelines. Your team should develop security patches in accordance with our patch development process for security issues. If your project is released as part of the MediaWiki tarball, your team should collaborate with Release Engineering to release patches and make the issue public.

After a security issue is prioritized, Unbreak Now! issues should have a plan for fixing the issue (with an estimated timeline) documented on the bug within 1 business day. Similarly, High priority issues should have a plan documented within 1 week of the issue being prioritized.