Research roadmap update: November 2024
An update to the research that the User Experience team is running over the next quarter.
Ryan and Eira talk with Stack Overflow senior research analyst Erin Yepis about the results of our 2024 Developer Survey, which polled more than 65,000 developers about the tools they use, the technologies they want to learn, their experiences at work, and much more. Erin highlights what the survey reveals about devs’ favorite programming languages (JavaScript, HTML, Python), the rise of Rust, the popularity of embedded technologies (Raspberry Pi, Arduino), developer sentiment around AI, and why tech debt tops the list of developer frustrations.
With the ever-increasing importance of data, we’re always looking for expert voices who can expand our view of what data, and our reliance on it, mean for software development and society as a whole. More and more of our lives are becoming data-driven. Is that a good thing?
For this episode, we spoke with Carol Lee, PhD, principal research scientist in the Developer Success Lab at Pluralsight, about her research into code review anxiety, how developers are coping, and how a workbook can help.
Dr. Cat Hicks, Director of Pluralsight Flow’s Developer Success Lab, joins Ben and Eira to talk about why ICs deserve recognition for their contributions to big projects (and how they can get it).
You read documentation and tutorials to become a better programmer, but if you really want to be cutting-edge, academic research is where it's at.
After speaking with subject matter experts, we decided to take a step back. In this post, we list and organize our feedback methods into a matrix. The goal is to offer a clear framework to follow and to identify areas that could be bolstered by alternative methods.
We’re excited to share highlights from our research into how satisfied people are with Stack Overflow. We’ve been exploring what users like best about the site and what their top pain points are, with the goal of improving the overall experience. To that end, we’ve launched a site satisfaction survey that continually polls users about their experiences using Stack Overflow.
Knowing our value and quantifying our value are two different things, which is why we commissioned Forrester to conduct a Total Economic Impact (TEI) study. They sat down with four of our enterprise-sized customers and dug deep.
Last week, we told you about research that found a number of security vulnerabilities in code snippets in Stack Overflow answers, and how some of those flaws had migrated into actual, real-live GitHub projects. Today, we’re following up on the top eight error types that the research highlighted and suggesting ways to avoid making the same mistakes.
Copying code itself isn’t always a bad thing. Code reuse can promote efficiency in software development; why solve a problem that has already been solved well? But when developers use example code without understanding its implications, problems can arise.
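To make that risk concrete, here’s a minimal, hypothetical sketch of the kind of snippet that spreads this way (the example is ours, not one of the eight error types from the study): code that silences a TLS certificate error is easy to copy and quietly dangerous in production.

```python
# Hypothetical example of an insecure copy-paste pattern (not taken
# from the study): disabling TLS certificate verification to make a
# local error "go away".
import requests

def fetch_unsafe(url: str) -> str:
    # verify=False silences certificate errors, but it also leaves the
    # request open to man-in-the-middle attacks.
    return requests.get(url, verify=False).text

def fetch_safe(url: str) -> str:
    # Keep verification on (the default) and fail loudly instead of
    # papering over a misconfigured trust store.
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.text
```

The unsafe version often arrives with upvotes attached, which is exactly why understanding a snippet matters more than its popularity.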
But if you could collect and analyze the opinions posted within the comments and questions, you could start to get a bead on the aggregate sentiment, sort of a Yelp for technology. That’s just what Gias Uddin, now a senior data scientist at the Bank of Canada, explored for his PhD thesis at McGill University. Together with his PhD supervisor, Foutse Khomh, an associate professor at Polytechnique Montréal, he developed a method to mine opinions on APIs and libraries from questions and comments posted on Stack Overflow.
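Their published approach is considerably more sophisticated, but the core intuition is easy to sketch. Below is a deliberately simplified, hypothetical illustration (the lexicon, API names, and sample posts are all made up, and this is not Uddin and Khomh’s actual method): find posts that mention an API, score each post’s sentiment, and average the scores per API.

```python
# Simplified sketch of opinion mining on developer posts. This is an
# illustration of the general idea, not the researchers' method.
from collections import defaultdict

# Toy sentiment lexicon; a real system would use a model trained on
# developer language rather than a hand-picked word list.
POSITIVE = {"fast", "clean", "reliable", "simple", "great"}
NEGATIVE = {"slow", "buggy", "confusing", "verbose", "broken"}

# Hypothetical library names to detect in post text.
APIS = {"gson", "jackson"}

def score(text: str) -> int:
    # Net count of positive minus negative words in the post.
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def aggregate(posts: list[str]) -> dict[str, float]:
    by_api: dict[str, list[int]] = defaultdict(list)
    for post in posts:
        lowered = post.lower()
        for api in APIS:
            if api in lowered:
                by_api[api].append(score(post))
    # Average sentiment per API: the "Yelp for technology" intuition.
    return {api: sum(scores) / len(scores) for api, scores in by_api.items()}

if __name__ == "__main__":
    posts = [
        "Gson is simple and fast for small payloads.",
        "Jackson felt verbose and confusing at first.",
        "Gson was slow on our huge documents.",
    ]
    print(aggregate(posts))  # {'gson': 0.5, 'jackson': -2.0}
```

The hard parts the real research tackles, like telling an opinion about the API apart from an opinion about the asker’s own code, are exactly what this toy version glosses over.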