Developers don't want security folks to babysit them, nor does security want to clean up avoidable messes.
"If you take the approach that security is the inspector of the developers' work, then that's kind of setting you up for conflict," said Cindy Blake, senior security evangelist at GitLab. But there "will always be more developers than there are security people."
Developers and security professionals are overlapping more. Companies with security ingrained in application development have faster deployment times, said Blake. "They're not adding a lot of value if they're spending all of their time jockeying spreadsheets to manage the list of vulnerabilities."
Sixty percent of developers are releasing code twice as fast as last year, up 25% compared to pre-pandemic levels, according to GitLab's 2021 DevSecOps survey of about 4,300 global developers and security respondents.
While security's role in DevOps is improving, and inching closer to DevSecOps, companies still face hurdles to involving security earlier in the development process. Some hurdles are of companies' own making: miscommunication, or prioritizing quick product releases at the cost of security gaps.
Thirty-six percent of developers and security professionals practiced DevOps or DevSecOps development methods, up from 27% in 2020, according to the survey.
Last year, companies made an effort to automate development processes — shifting left and including security earlier in the build process. More than 70% of security respondents said their teams shifted left, an increase from 65% in 2020.
But policies in the build process also need automation, Blake said. Companies can shape DevSecOps policies by asking:
- What do you do when you find a critical vulnerability?
- Does it trigger something?
- Does it stop the pipeline?
- Who has to look at it, if anyone?
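The questions above can be turned into explicit rules. As a minimal sketch (not GitLab's actual implementation; the `Finding` type, severity labels and team names are invented for illustration), an automated policy might map each question to a decision the pipeline enforces:

```python
# Hypothetical sketch of an automated vulnerability policy: each of the
# questions above becomes an explicit rule evaluated in the pipeline
# instead of a manual judgment call.

from dataclasses import dataclass

@dataclass
class Finding:
    severity: str  # "critical", "high", "medium" or "low"

def apply_policy(findings):
    """Return (stop_pipeline, assignees) for a set of scan findings."""
    criticals = [f for f in findings if f.severity == "critical"]
    highs = [f for f in findings if f.severity == "high"]
    if criticals:
        # Critical finding: stop the pipeline, route to the security team.
        return True, ["security-team"]
    if highs:
        # High finding: let the build continue, but require a review.
        return False, ["dev-on-call"]
    # Nothing actionable: pipeline proceeds, nobody is paged.
    return False, []

stop, owners = apply_policy([Finding("high"), Finding("low")])
```

Because the policy runs on every build, circumvention becomes visible: if most builds override the `stop` decision, as Blake notes, there effectively is no policy.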
"What I found in the very beginning of the pandemic was people were circumventing their policies, and just didn't track it very well," said Blake. "And when the policies become automated, then if you find that 90% of the time you're circumventing the policy, you probably don't really have a policy."
As companies shift left, 53% of developers run static application security testing (SAST), up from less than 40% in 2020, according to the report. Forty-four percent of respondents are running dynamic application security testing (DAST), up from 27% last year.
Alone, neither scan is sufficient, but the increase in their use is a nod to how far application security has come.
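The two scan types cover different blind spots: SAST reads source code without running it, while DAST probes the running application from the outside. As a toy illustration of the static side (the `sast_scan` function and its secret-matching pattern are invented for this sketch; real scanners use far richer analysis):

```python
import re

# Toy SAST-style check: inspect source text, without executing it,
# for likely hardcoded credentials. A deliberate simplification of
# what production static analyzers do.
SECRET_PATTERN = re.compile(
    r'(password|api_key)\s*=\s*["\'][^"\']+["\']', re.IGNORECASE
)

def sast_scan(source: str):
    """Return line numbers that appear to contain hardcoded secrets."""
    return [
        i + 1
        for i, line in enumerate(source.splitlines())
        if SECRET_PATTERN.search(line)
    ]

code = 'db_host = "prod"\npassword = "hunter2"\n'
hits = sast_scan(code)  # flags line 2
```

A check like this finds flaws before the code ever runs, but it cannot see runtime issues such as misconfigured servers or authentication bypasses, which is where DAST's outside-in probing comes in.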
"With application security, it's people, processes and technology — it really affects how people do their job," said Blake. The rise in SAST use signals progress in a discipline that has had to play catch-up with other areas of security.
However, making scan results available to developers is a constant hurdle. Only 20% of respondents say they bring SAST results into the pipeline for their developers, and just 16% say DAST results are "easily available" to developers, a minimal increase from 2020.
Because developers rely so heavily on third-party libraries and resources, "all of these other elements represent attack surfaces and a path for the bad guys to get in," said Blake. "If you're only testing the code, you could be missing all of these other elements."
The challenge is that security teams have not "kept up with what the developers are using to know what those weak points could be," she said.
A matter of trust
For 42% of respondents, security testing happens too late in the development process, according to the survey. By that point, security professionals said, the vulnerabilities found were too hard to process and remediate.
"You sort of have to accept that you're always going to be behind. It's a matter of how behind can I be before we have a safety or trust problem," said James Arlen, CISO at Aiven. "You need to be able to have those frank conversations with product owners that say, 'This is something that can or will affect the utilization of the product.'"
The survey found developers and security professionals agreed they need to improve communication.
"Security people need developers to be more like security people and developers need security people to be more like developers," said Arlen. "It's not that either group needs to change for the benefit of the other, it's that both need to change for the benefit of all."
Companies have seen an improvement in developers' ability to catch bugs in software. This year, 45% of security respondents said their developers caught 25% or fewer of bugs, a sharp drop from the 93% who said the same in 2020, leaving security teams with fewer bugs to find on their own.
It's a challenge to balance both roles and the personalities that often fill them.
Communicating with developers is more seamless when security speaks to them in developer-familiar language. For example, "don't talk about vulnerabilities, because that has no real meaning in the development world," said Arlen. Instead, substitute the word with "defects"; developers know how to treat and triage defects, not vulnerabilities.
In this example, a defect and a vulnerability are effectively the same thing, said Arlen, "so why don't we use their language?"
On the development side, security professionals want developers to put on their "evil beard" and write application tests that try to break everything, according to Arlen. The practice goes against everything developers try to prove in tests, but it adopts the security mindset of "expect the worst."
"I need to write a test that tries to do a negative thing to your software," said Arlen. "Please, write all the positive tests that prove the software works, and then I want you to write a couple of tests that try to break your software."
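Arlen's suggestion can be sketched in a few lines. In this hypothetical example (the `parse_port` helper is invented for illustration), one positive test proves the happy path, and the "evil beard" tests actively try to break the function:

```python
def parse_port(value: str) -> int:
    """Parse a TCP port string, rejecting anything outside 1-65535."""
    port = int(value)  # raises ValueError on non-numeric input
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port

# Positive test: prove the software works as intended.
assert parse_port("8080") == 8080

# Negative tests: expect the worst, and prove it is handled.
for bad in ["0", "70000", "-1", "8080; rm -rf /"]:
    try:
        parse_port(bad)
    except ValueError:
        pass  # rejected, as it should be
    else:
        raise AssertionError(f"accepted bad input: {bad!r}")
```

The last malicious-looking input never reaches a shell here; it simply fails numeric parsing, which is exactly the defensive behavior the negative test is meant to prove.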
Avoid hyperbolic language, and speak in terms of direct impact, he said. "That's why that retreat to their language, using terms like 'defect,' you're creating something that is more visceral to them."