
Use This AI To Find Your Next Engineering Gig

By Tekla S. Perry

Rocket, a startup that builds AI-based tools for recruiters and offers recruitment services of its own, had been on the verge of profitability when the coronavirus pandemic hit. It quickly saw corporate recruitment pushes dry up.

With the pause in hiring affecting its main business and mounting layoffs at tech companies, Rocket turned its efforts toward matching laid-off engineers with new jobs pro bono. Rocket gathered data on layoffs, set its AI software and recruiters to cleaning up the data and making it easier to navigate, and launched a new portal to the data. Dubbed Parachute, the portal now has 13,500 professionals from around the world on its original, U.S.-based English-language site, and Rocket is bringing up sister sites in other countries.

Here’s what Rocket cofounder and CEO Abhinav Agrawal had to say when IEEE Spectrum talked to him this month about the pandemic’s effect on tech jobs, creating Parachute, and what he sees for tech careers on the near horizon.

On the effect of the pandemic on tech jobs:

Abhinav Agrawal: “When Covid hit, we saw companies scale back hiring almost immediately. As soon as companies started working from home—in early to mid-February—they became cautious about hiring. Interviewing took a big hit; we had all been conditioned to believe that interviewing in person was the gold standard, so a lot of candidates were affected as interviews were pushed out or cancelled until they could be rescheduled in person.

“Then, as the market demand started softening, you had startups going into self-preservation mode. Prominent venture capitalists issued advisories to their portfolio companies, saying that winter is coming, so it is time to flip from ‘growth at all costs’ to ‘let’s batten down the hatches and survive this.’

“Right now, we are in purgatory. We are seeing about a third of companies going straight ahead with hiring, a third just tiptoeing back in, and about a third sitting in survival mode [and not hiring at all].”

On interviewing in a pandemic:

Agrawal: “The process is completely changing. On-site interviews of four to five hours at a stretch are out the window. That’s been challenging for a lot of senior engineering leaders who like to do whiteboard interviews and are having to adapt.

“We are also finding that not being able to meet someone in person makes it harder to trust your judgment—to sense fit—so instead of doing three on-site interviews, people are doing maybe five Zoom interviews.”

On the impact of working from home:

Agrawal: “Initially, people still wanted to hire people in their headquarters’ city. But now, with companies doing fine with remote workers, people are wondering if they should look for candidates outside their headquarters cities, so that even when office environments come back, they can still hire remote workers. Companies are, however, focused on candidates in the headquarters’ time zone—looking at, say, Seattle, Los Angeles, and Latin America. Being in the same time zone makes collaboration easier.

“There are [salary] implications. People at every company we talk to are having difficult conversations internally about how to manage that. If current employees want to move, they say, ‘What do we do about salaries? Do we have different bands for different cities? Do we penalize them for moving?’ These are tough conversations companies are now having.”

On layoffs and starting Parachute:

Agrawal: “We saw layoffs starting to happen even before the lockdowns. First they were in concentrated industries, like building software for local retail, restaurant operations, and travel. By early April, we could see layoffs were beginning to escalate. Now we are hopefully on the other side of the peak of the curve, but layoffs are still trickling through. Companies that thought they could ride it out are starting to think they can’t; they might have thought [the pandemic] would last a couple of months, but it’s July now, and things could be worse in the fall.

“Pretty early on we started thinking about how to help the community [of laid off tech professionals] and recruiters. There are a couple of issues in layoffs.

“For one, it’s hard to know exactly who at a company has been laid off when they announce, say, that they have laid off 25 percent of their workforce. If you reach out at random, you’ll find three out of four people weren’t laid off.

“Second, there is a stigma around layoffs, so often people won’t mention that they’ve been laid off. But if layoff information is institutionalized, becomes a platform, and top companies are hiring there, it takes the stigma away.

“So we thought it would be great if there were a centralized place where recruiters could access data and reach out directly.

“We get the data from three sources. Sometimes, we work directly with companies doing a layoff to offer an opt-in [to our service]. We did that with the HR team at Lyft.

“We also look at publicly available spreadsheets or lists. These are sometimes set up by a coworker or someone in HR; the formats vary a lot. We only use these when the information is available publicly. And we remove anybody on those lists who contacts us; we try to do that within an hour of being notified. Finally, you can sign up directly on the site.

“Then we standardize the data, which is challenging. Even with something like ‘Bay Area,’ people will say San Francisco, SF, Peninsula. We also try to standardize job functions and titles. Our AI tools take a first pass. They will, for example, infer what someone does based on a given description, then give [the job] a standardized title, calculate years of experience, and label the region. They also extract skills from a resume and standardize those to make them searchable. AI gets about 80 to 90 percent of the cases right, but we still have a human review every profile to look for edge cases.
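As a rough illustration of the first-pass normalization Agrawal describes, a location standardizer might map free-text variants onto canonical region labels before human review. The alias table and function name here are invented for this sketch; they are not Rocket's actual pipeline, which uses trained AI models rather than a fixed lookup.

```python
# Hypothetical sketch of a first-pass location normalizer.
# The alias table is an assumption for illustration only.

LOCATION_ALIASES = {
    "san francisco": "SF Bay Area",
    "sf": "SF Bay Area",
    "peninsula": "SF Bay Area",
    "bay area": "SF Bay Area",
    "nyc": "New York",
    "new york city": "New York",
}

def normalize_location(raw: str) -> str:
    """Map a free-text location string onto a canonical region label."""
    key = raw.strip().lower()
    # Unknown inputs fall back to a tidied version of the original,
    # leaving the edge case for a human reviewer to resolve.
    return LOCATION_ALIASES.get(key, raw.strip().title())

print(normalize_location("  SF "))       # canonical region label
print(normalize_location("Peninsula"))   # same region, different alias
```

A real pipeline would back a table like this with a learned classifier and fuzzy matching, which is where the "80 to 90 percent" accuracy and the human-review step come in.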

“We started building Parachute right about when the lockdown started. We did a soft launch at the end of April and started getting the word out in early May. Right now, we have 13,500 people on the list from all over the world—we’ve had people signing up in India, Dubai, Chile, New Zealand, Australia, Europe, and Nigeria. We now have a Spanish-language version live in Chile, and we have Israeli and French versions in the works. We are continuing to make improvements in how we ingest data and bring people on.

“We have candidates from just about every tech layoff that happened, like Uber, Lyft, Airbnb, Peerspace, and WeWork. We have several thousand recruiters using it, from every major company, like Facebook, Amazon, Apple, and Twitter on the big side, to Coinbase, Shift, and Doximity on the small side. We’ve had verified recruiters reach out to candidates from our platform 25,000 times; generally, mid-level professionals are getting the most interest. We don’t know how many have been hired; because we aren’t charging anybody for this service, we don’t have a good way of tracking that. We have received more than 150 emails from professionals letting us know [they were hired], but we believe the real number is two to three times that.

“My hope is that the need for this product goes away. It’s a weird thing, you build a product, but you actually aren’t hoping that it gets more use.”

AI Recruiting Tools Aim to Reduce Bias in the Hiring Process

By Jeremy Hsu

Two years ago, Amazon reportedly scrapped a secret artificial intelligence hiring tool after realizing that the system had learned to prefer male job candidates while penalizing female applicants—the result of the AI training on resumes that mostly male candidates had submitted to the company. The episode raised concerns over the use of machine learning in hiring software that would perpetuate or even exacerbate existing biases.

Now, with the Black Lives Matter movement spurring new discussions about discrimination and equity issues within the workforce, a number of startups are trying to show that AI-powered recruiting tools can in fact play a positive role in mitigating human bias and help make the hiring process fairer.

These companies claim that, with careful design and training of their AI models, they can specifically address various sources of systemic bias in the recruitment pipeline. It’s not a simple task: AI algorithms have a long history of unfairness regarding gender, race, and ethnicity. The strategies these companies have adopted include scrubbing identifying information from applications, relying on anonymous interviews and skill-set tests, and even tuning the wording of job postings to attract as diverse a field of candidates as possible.

One of these firms is GapJumpers, which offers a platform for applicants to take “blind auditions” designed to assess job-related skills. The startup, based in San Francisco, uses machine learning to score and rank each candidate without including any personally identifiable information. Co-founder and CEO Kedar Iyer says this methodology helps reduce traditional reliance on resumes, which as a source of training data is “riddled with bias,” and avoids unwittingly replicating and propagating such biases through the scaled-up reach of automated recruiting.

That deliberate approach to reducing discrimination may be encouraging more companies to try AI-assisted recruiting. As the Black Lives Matter movement gained widespread support, GapJumpers saw an uptick in queries from potential clients. “We are seeing increased interest from companies of all sizes to improve their diversity efforts,” Iyer says.

AI with humans in the loop

Another lesson from Amazon’s gender-biased AI is that paying close attention to the design and training of the system is not enough: AI software will almost always require constant human oversight. For developers and recruiters, that means they cannot afford to blindly trust the results of AI-powered tools—they need to understand the processes behind them, how different training data affects their behavior, and monitor for bias.

“One of the unintended consequences would be to continue this historical trend, particularly in tech, where underserved groups such as African Americans are not within a sector that happens to have a compensation that is much greater than others,” says Fay Cobb Payton, a professor of information technology and analytics at North Carolina State University, in Raleigh. “You’re talking about a wealth gap that persists because groups cannot enter [such sectors], be sustained, and play long term.”

Payton and her colleagues highlighted several companies—including GapJumpers—that take an “intentional design justice” approach to hiring diverse IT talent in a paper published last year in the journal Online Information Review.

According to the paper’s authors, there is a broad spectrum of possible actions that AI hiring tools can perform. Some tools may just provide general suggestions about what kind of candidate to hire, whereas others may recommend specific applicants to human recruiters, and some may even make active screening and selection decisions about candidates. But whatever the AI’s role in the hiring process, there is a need for humans to have the capability to evaluate the system’s decisions and possibly override them.

“I believe that human-in-the-loop should not be at the end of the recommendation that the algorithms suggest,” Payton says. “Human-in-the-loop means in the full process of the loop from design to hire, all the way until the experience inside of the organization.”

Each stage of an AI system’s decision point should allow for an auditing process where humans can check the results, Payton adds. And of course, it’s crucial to have a separation of duties so that the humans auditing the system are not the same as those who designed the system in the first place.

“When we talk about bias, there are so many nuances and spots along this talent acquisition process where bias and bias mitigation come into play,” says Lynette Yarger, a professor of information sciences and technology at Pennsylvania State University and lead author on the paper with Payton. She added that “those companies that are trying to mitigate these biases are interesting because they’re trying to push human beings to be accountable.”

Another example highlighted by Yarger and Payton is a Seattle-based startup called Textio that has trained its AI systems to analyze job advertisements and predict their ability to attract a diverse array of applicants. Textio’s “Tone Meter” can help companies offer job listings with more inclusive language: Phrases like “rock star” that attract more male job seekers could be swapped out for the software’s suggestion of “high performer” instead.
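The wording swap described above can be pictured as a phrase-suggestion pass over a job ad. The phrase table below is invented for this sketch (only the “rock star” → “high performer” pair comes from the article); Textio’s actual product relies on trained models and outcome data, not a static lookup.

```python
# Toy illustration of suggesting more inclusive job-ad wording.
# The phrase table is an assumption for this example, not Textio's model.

SUGGESTIONS = {
    "rock star": "high performer",   # swap cited in the article
    "ninja": "expert",               # hypothetical entries
    "dominate": "lead",
}

def suggest_rewrites(ad_text: str) -> list[tuple[str, str]]:
    """Return (found_phrase, suggested_replacement) pairs for a job ad."""
    lowered = ad_text.lower()
    return [(old, new) for old, new in SUGGESTIONS.items() if old in lowered]

print(suggest_rewrites("We need a rock star engineer to dominate the market."))
```

The interesting part of a tool like Textio is not the substitution itself but the prediction behind it: which phrasings historically correlate with a more diverse applicant pool.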

“We use Textio for our own recruiting communication and have from the beginning,” says Kieran Snyder, CEO and co-founder of Textio. “But perhaps because we make the software, we know that Textio on its own is not the whole solution when it comes to building an equitable organization—it’s just one piece of the puzzle.”

Indeed, many tech companies, including those that develop AI-powered hiring tools, are still working on inclusion and equity. Enterprise software company Workday, founded by former PeopleSoft executives and headquartered in Pleasanton, Calif., has more than 3,700 employees worldwide and clients that include half the Fortune 100. During a company forum on diversity and racial bias in June, Workday acknowledged that Black employees make up just 2.4 percent of its U.S. workforce versus the average of 4.4 percent for Silicon Valley firms, according to SearchHRSoftware, a human resources technology news site.

AI hiring tools: not a quick fix

Another challenge for AI-powered recruiting tools is that some customers expect them to offer a quick fix to a complex problem, when in reality that is not the case. James Doman-Pipe, head of product marketing at Headstart, a recruiting software startup based in London, says any business interested in reducing discrimination with AI or other technologies will need significant buy-in from the leadership and other parts of the organization.

Headstart’s software uses machine learning to evaluate job applicants and generate a “match score” that shows how well the candidates fit a job’s requirements for skills, education, and experience. “By generating a match score, recruiters are more likely to consider underprivileged and underrepresented minorities to move forward in the recruiting process,” Doman-Pipe says. The company claims that in tests comparing the AI-based approach to traditional recruiting methods, clients using its software saw significant improvements in the diversity makeup of new hires.
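One simple way to picture a match score of this kind is as coverage of a job’s required skills by a candidate’s skills. This is a sketch under assumed scoring rules, not Headstart’s actual model, which also weighs education and experience via machine learning.

```python
# Minimal sketch of a skills-based "match score" in [0, 1].
# The scoring rule (fraction of required skills covered) is an
# assumption for illustration, not Headstart's actual algorithm.

def match_score(required: set[str], candidate: set[str]) -> float:
    """Fraction of a job's required skills that the candidate covers."""
    if not required:
        return 0.0
    return len(required & candidate) / len(required)

score = match_score({"python", "sql", "ml"}, {"python", "sql", "java"})
print(round(score, 2))  # 2 of 3 required skills covered
```

A score computed from skills alone, rather than from names or schools, is the point: it gives recruiters a reason to advance candidates they might otherwise have screened out.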

Still, one of the greatest obstacles AI-powered recruiting tools face before they can gain widespread trust is the lack of public data showing how different tools can help—or hinder—efforts to make tech hiring more equitable.

“I do know from interviews with software companies that they do audit, and they can go back and recalibrate their systems,” Yarger, the Pennsylvania State University professor, says. But the effectiveness of efforts to improve algorithmic equity in recruitment remains unclear. She explains that many companies remain reluctant to publicly share such information because of liability issues surrounding equitable employment and workplace discrimination. Companies using AI tools could face legal consequences if the tools were shown to discriminate against certain groups.

For North Carolina State’s Payton, it remains to be seen whether corporate commitments to addressing diversity and racial bias will have a broader and lasting impact on the hiring and retention of tech workers—and whether or not AI can prove significant in helping to create an equitable workforce.

“Association and confirmation biases and networks that are built into the system, those don’t change overnight,” she says. “So there’s much work to be done.”

Black Tech Professionals Are Still Paid Less Than Their White Colleagues

By Tekla S. Perry
Illustration: Harry Campbell

The gap between the average salary offered to Black tech professionals and what’s offered to white tech professionals is closing at a snail’s pace. According to an analysis by the job search firm Hired, in 2019 Black tech professionals were offered an average of US $10,000 a year less than white tech workers. That’s slightly better than the 2018 gap of $11,000, but not much better.

Meanwhile, Hispanic tech professionals lag $3,000 behind their white counterparts, down from $7,000 in 2018. Asian tech professionals, having pulled ahead in recent years, continue to command a slight edge in average salaries over their white colleagues.

Within each racial group, tech professionals who identified themselves as female received lower average salary offers than their male counterparts, according to Hired’s 2020 State of Wage Inequality in the Workplace Report, released earlier this year.

One promising takeaway from Hired’s 2020 State of Salaries Report was that tech salaries grew in the United States, Canada, and the United Kingdom in 2019, with the U.S. average up to $146,000 (an 8 percent increase over 2018) and the average of all three regions up to $130,000 (a less than 1 percent increase).

Or at least the trend would have been promising, had things not changed so much between the close of 2019 and today. Under normal conditions, the information contained in Hired’s salaries report would be seen as a trend line that would progress into the upcoming year. But what it means right now, in the midst of the coronavirus pandemic, is anyone’s guess.

“Tech headquarters are closed, work from home is the new normal, Amazon and Netflix usage is soaring, well-known tech unicorns like Uber, Lyft, and Airbnb are laying off thousands,” the Hired report states.

Meanwhile, Facebook is looking to adjust salaries up or down based on the cost of living in exchange for the freedom to work remotely. Whether the new normal will ever return to the old normal remains unclear.

Still, it’s worth looking at the gains made in 2019, because we’ll certainly be referring to these numbers as we monitor the engineering jobs marketplace during and after the pandemic.

According to Hired’s analysis, tech workers in Austin and Toronto saw the biggest increase in salary offers, both up 10 percent over 2018.

Still, in terms of raw numbers, average tech salaries in the San Francisco Bay Area remained on top, averaging $155,000 (up 7 percent) in 2019. When salaries are adjusted for the cost of living, though, many regions are well ahead of the San Francisco Bay Area: Austin’s $137,000 annual 2019 salary, for example, is equivalent to a salary of $224,000 in the Bay Area. But recent announcements by large tech companies about adjusting salaries based on the cost of living when their employees relocate may change that picture over the next 12 months.
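The cost-of-living comparison above is a ratio adjustment: a salary is scaled by the ratio of the two regions’ cost-of-living indices. The index values below are assumptions, reverse-engineered so the example reproduces the article’s Austin figure ($137,000 ≈ $224,000 in Bay Area terms); they are not Hired’s actual index data.

```python
# Sketch of the cost-of-living adjustment implied by the figures above.
# Index values are assumptions chosen to match the article's example
# (224,000 / 137,000 ≈ 1.635); they are not official index data.

COL_INDEX = {
    "Austin": 100.0,
    "SF Bay Area": 163.5,
}

def equivalent_salary(salary: float, from_city: str, to_city: str) -> float:
    """Scale a salary by the ratio of cost-of-living indices."""
    return salary * COL_INDEX[to_city] / COL_INDEX[from_city]

print(round(equivalent_salary(137_000, "Austin", "SF Bay Area")))  # ~224,000
```

The same ratio logic underlies the relocation-based pay adjustments that large tech companies have floated, just run in the opposite direction.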

This article is based on two posts in our View From the Valley blog.

Top Programming Languages 2020

By Stephen Cass

It would be an understatement to say it’s been a turbulent year since the last time IEEE Spectrum broke out the digital measuring tools to probe the relative popularity of programming languages. Yet one thing remains constant: the dominance of Python.

Since it’s impossible for even the most aggressive spy agency in the world to find out what language every single programmer uses when they sit down at their keyboards—especially the ones tapping away on retro computers or even programmable calculators—we rely on combining 11 metrics from online sources that we think are good proxies for the popularity of 55 languages.

Because different programmers have different interests and needs, our online rankings are interactive, allowing you to weight the metrics as you see fit. Think one measure is way more valuable than the others? Max it out. Disagree with us about the worth of another? Turn it off. We have a number of preset rankings that focus on things such as emerging languages or what jobs employers are looking to fill (big thanks to CareerBuilder for making it possible to query their database this year, now that it’s no longer accessible using a public application programming interface).
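Under the hood, reweighting the interactive rankings amounts to recomputing a weighted sum of each language’s normalized metric scores and re-sorting. The metric names and score values in this sketch are invented for illustration; the real ranking combines 11 metrics across 55 languages.

```python
# Minimal sketch of user-adjustable weighted ranking, as the
# interactive rankings allow. Metric names and scores are invented
# for illustration; the real app uses 11 metrics over 55 languages.

SCORES = {  # each metric pre-normalized to [0, 1]
    "Python": {"search": 1.00, "twitter": 0.70, "jobs": 0.95},
    "Java":   {"search": 0.85, "twitter": 0.55, "jobs": 1.00},
    "Cobol":  {"search": 0.05, "twitter": 0.90, "jobs": 0.10},
}

def rank(weights: dict[str, float]) -> list[str]:
    """Rank languages by the weighted sum of their metric scores."""
    def total(lang: str) -> float:
        return sum(weights.get(metric, 0.0) * value
                   for metric, value in SCORES[lang].items())
    return sorted(SCORES, key=total, reverse=True)

print(rank({"search": 1, "twitter": 1, "jobs": 1}))  # balanced weighting
print(rank({"twitter": 1}))                          # Twitter-only view
```

Zeroing out all but one weight is exactly the “look at the Twitter metric alone” view discussed below, which is how a niche spike (such as Cobol’s in April) can surface in a single-metric ranking.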

Screenshot of the top ten list from the interactive app.

Our default ranking is weighted toward the interests of an IEEE member, and looking at the top entries, we see that Python has held onto its comfortable lead, with Java and C once again coming in second and third place, respectively. Arduino has seen a big jump, rising from 11th place to seventh. (Purists may argue that Arduino is not a language but rather a hardware platform that is programmed using a derivative of Wiring, which itself is derived from C/C++. But we have always taken a very pragmatic approach to our definition of “programming language,” and the reality is that when people are looking to use an Arduino-compatible microcontroller, they typically search for “Arduino code” or buy books about “Arduino programming,” not “Wiring code” or “C programming.”)

One interpretation of Python’s high ranking is that its metrics are inflated by its increasing use as a teaching language: Students are simply asking and searching for the answers to the same elementary questions over and over. There’s an historical parallel here. In the 1980s, BASIC was very visible—there were books, magazines, and even TV programs devoted to the language. But few professional programmers used it, and when the home computer bubble burst, so did BASIC’s, although some advanced descendants like Microsoft Visual Basic are still relatively popular professionally.

There are two counterarguments, though: The first is that students are people, too! If we pay attention only to what professional and expert coders do, we’re at risk of missing an important part of the picture. The second is that, unlike BASIC, Python is frequently used professionally and in high-profile realms, such as machine learning, thanks to its enormous collection of high-quality, specialized libraries.

However, the COVID-19 pandemic has left some traces on the 2020 rankings. For example, if you look at the Twitter metric alone in the interactive, you can see that Cobol is in seventh place. This is likely due to the fact that in April, when we were gathering the Twitter data, Cobol was in the news because unemployment benefit systems in U.S. states were crashing under the load as workers were laid off due to lockdowns. It turns out that many of these systems had not been significantly upgraded since they were created decades ago, and a call went out for Cobol programmers to help shore them up.

There’s always a vibrant conversation about Spectrum’s Top Programming Languages online, so we encourage you to explore the full rankings and leave comments there, particularly if you want to nominate an emerging language for inclusion in next year’s rankings.

This article appears in the August 2020 print issue as “The Top Programming Languages.”