Investors continue to pour money into generative AI technologies. Replit, the integrated development environment (IDE) company developing a code-generating AI tool called Ghostwriter, this week raised nearly $100 million ($97.4 million) at a post-money valuation of $1.16 billion.
Andreessen Horowitz led the round—an extension of the Series B—with participation from Khosla Ventures, Coatue, SV Angel, Y Combinator, Bloomberg Beta, Naval Ravikant, ARK Ventures, and Hamilton Helmer.
“We are relentlessly focused on empowering the one billion software developers with product experiences, expanding Replit’s cloud services and driving innovation in artificial intelligence,” Replit founder and CEO Amjad Masad said in a statement.
“AI is already bringing this future closer,” Masad continued. “We look forward to expanding our offerings for professional developers.”
Based in San Francisco, Replit was co-founded in 2016 by programmers Amjad Masad and Faris Masad and designer Haya Odeh. Before founding Replit, Amjad Masad held engineering positions at Yahoo and Facebook, where he built software development tools.
Perhaps Replit’s headline feature is Ghostwriter, a suite of capabilities powered by AI models trained on publicly available code. Ghostwriter — much like GitHub’s Copilot — can make suggestions and interpret code, taking into account what users type and other context in their accounts, such as the programming language they’re using.
Ghostwriter appears to be the driving force behind Replit’s recent explosive growth, resulting in a partnership with Google Cloud and a user base of over 22 million developers. But like all generative AI tools, it comes with risks — and potential legal ramifications that haven’t yet been fully explored in court.
Microsoft, GitHub, and OpenAI are being sued in a class action that accuses them of violating copyright law by allowing Copilot to reproduce portions of licensed code without giving credit. Liability aside, some legal experts say AI like Copilot could put companies at risk if they inadvertently incorporate copyrighted suggestions from the tool into their production software.
It’s unclear whether Ghostwriter was also trained on licensed or copyrighted code. But Replit did note that code suggested by Ghostwriter may contain “incorrect, offensive or otherwise inappropriate” strings.
That includes insecure code. Software engineers who use code-generating AI systems are more likely to introduce security vulnerabilities in the applications they develop, according to a recent Stanford University study. While the research didn’t target Replit specifically, it stands to reason that developers using Ghostwriter would be susceptible to the same risks.
In other words, Replit has its work cut out for it.