This is only roughly how it works at Google.
You have your point-testing analysis team to find faults and advise on which points of code need to be developed or redeveloped. Then the manager of each coding department knows their people's share of what points of code need to be completed.
Each point of code, let's say, averages 1,000 characters, or shall we say 1,000 bytes.
A very good coder will produce 10 to 20 of these per week.
So roughly 10 KB a week at the low end, working 8 hours a day and say 5 days a week.
So 10,000 bytes over 40 hours.
So 250 bytes an hour.
So 4.2 bytes a minute.
So a dismal 0.07 bytes per second, or about 1 bit every 2 seconds.
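To make the arithmetic easy to check, here is a minimal Python sketch of the numbers above, taking the 1,000-byte point of code and 40-hour week as given:

```python
# Back-of-envelope check of the per-coder throughput above,
# assuming a 1,000-byte "point of code" and a 40-hour week.
POINT_SIZE = 1_000          # bytes per point of code (assumption from the text)
POINTS_PER_WEEK = 10        # low end of the 10-20 range
HOURS_PER_WEEK = 8 * 5      # 8 hours a day, 5 days a week

bytes_per_week = POINT_SIZE * POINTS_PER_WEEK        # 10,000 B
bytes_per_hour = bytes_per_week / HOURS_PER_WEEK     # 250 B/h
bytes_per_minute = bytes_per_hour / 60               # ~4.2 B/min
bytes_per_second = bytes_per_minute / 60             # ~0.07 B/s

print(f"{bytes_per_hour:.0f} B/h, {bytes_per_minute:.1f} B/min, "
      f"{bytes_per_second:.2f} B/s")
# -> 250 B/h, 4.2 B/min, 0.07 B/s (about 1 bit every ~2 seconds)
```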
However, if the team is 10,000 techies strong, that's 100 MB of technical code a week, or roughly 5 GB a year, which when compiled and graphics data is added makes 100-200+ GB. And these gigabytes, made by top techies at 0.07 bytes a second, are very valuable: they contain the main operating functionality of the system and the levels above it, and form the most optimal part of the code on your computer.
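Scaling the same per-coder rate up to the assumed 10,000-person team, again as a quick Python check (the 50 working weeks per year is my assumption):

```python
# Scaling the per-coder rate to a 10,000-person team (assumed numbers).
TEAM_SIZE = 10_000
BYTES_PER_CODER_WEEK = 10_000      # 10 points of 1,000 bytes, from above
WEEKS_PER_YEAR = 50                # assumed working weeks

team_bytes_per_week = TEAM_SIZE * BYTES_PER_CODER_WEEK
team_bytes_per_year = team_bytes_per_week * WEEKS_PER_YEAR

print(f"{team_bytes_per_week / 1e6:.0f} MB/week, "
      f"{team_bytes_per_year / 1e9:.0f} GB/year of source")
# -> 100 MB/week, 5 GB/year of source
```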
You could push someone closer to 0.2 bytes a second, but the closer you get to that, the closer you are to the current mental threshold.
So the bar is actually quite low: all the AI has to do is deliver the points of code the team manager needs faster than a human does per 20 watts, and the machine would be powerful enough to make a bigger dent in how many code points can be melded together, leading to far more efficient and diverse technology functionality. AI is inching closer to this prospect all the time.
This also demonstrates that fast typing is not what makes a good coder at the moment; fast, optimal, engineering-based thinking about code is.
This is ignoring whatever they may also type in the same time in terms of notation; I'm talking about mentally thought-out code throughput. And it makes sense if you think there's the equivalent of 38,000,000 top coders working at this pace at any time: that's 2.7 MB a second of coding, or about 85 TB a year, which, when all the data that goes around this coding and the compiled output is taken into account, gives you the internet of functionality and code diversity you see before you on the net. Corporate research gets the lion's share of these technical terabytes.
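The same quick check for the global figures, taking the 38,000,000 top coders and the 0.07 B/s rate from above as given:

```python
# Global aggregate, assuming ~38 million top coders (figure from the text),
# each sustaining ~0.07 B/s of thought-out code around the clock.
TOP_CODERS = 38_000_000
RATE = 0.07                        # bytes per second per coder, from above
SECONDS_PER_YEAR = 365 * 24 * 3600

global_rate = TOP_CODERS * RATE                     # bytes per second
yearly_output = global_rate * SECONDS_PER_YEAR      # bytes per year

print(f"{global_rate / 1e6:.1f} MB/s, {yearly_output / 1e12:.0f} TB/year")
# -> 2.7 MB/s, 84 TB/year (the ~85 TB figure above)
```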
If we could develop code at 10 bytes/s and handle more points of code at the same time, the amount of good technical code would be closer to 1 PB a year: about 0.2% of what is added to YouTube every year in video content, or roughly 0.05% of the data the top supercomputers produce each year.
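One way to read the 1 PB figure: it does not require all 38 million coders to hit 10 B/s; a few million sustaining that rate would be enough. A minimal sketch, with the 1 PB/year target taken from the text:

```python
# Rough check of the 1 PB/year figure at the hypothetical 10 B/s rate.
RATE = 10                          # bytes per second (hypothetical target)
SECONDS_PER_YEAR = 365 * 24 * 3600
TARGET = 1e15                      # 1 PB of technical code per year

per_coder_year = RATE * SECONDS_PER_YEAR       # output of one such coder
coders_needed = TARGET / per_coder_year        # coders to reach the target

print(f"{per_coder_year / 1e6:.0f} MB per coder per year; "
      f"~{coders_needed / 1e6:.1f}M coders to reach 1 PB/year")
# -> 315 MB per coder per year; ~3.2M coders to reach 1 PB/year
```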
This would make for a significant boost to the functionality of the internet: many current software pipe dreams would become a reality, and there would be far more functional options for consumers.
And 10 characters a second is doable; we manage this easily when we talk.
So a talking-based programming interface has the potential to move us up a gear, better handle repetition, and move us closer to the 1 PB worth of technical base code generated from human input.
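Typical speech rates do back the talking figure up; a quick check assuming an average of roughly 150 words per minute and 5 characters per word (both common ballpark figures, not from the text):

```python
# Sanity check on the talking-pace claim, assuming average English speech.
WORDS_PER_MINUTE = 150      # typical conversational rate (assumption)
CHARS_PER_WORD = 5          # rough English average (assumption)

chars_per_second = WORDS_PER_MINUTE * CHARS_PER_WORD / 60
print(f"~{chars_per_second:.1f} characters per second")
# -> ~12.5 characters per second, comfortably above the 10/s claim
```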
12 kbit/s, I think, is at the mental limit, and it could easily take five generations or more before people can achieve this throughput with the computer, in terms of human intent handling that degree of computational complexity.
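To put that speculated limit next to the other rates in this piece (both comparison figures come from the estimates above):

```python
# Comparing the speculated 12 kbit/s mental limit to the earlier rates.
mental_limit = 12_000 / 8          # 12 kbit/s -> 1,500 bytes per second
today = 0.07                       # B/s, today's estimate from above
talk_target = 10                   # B/s, the speech-paced goal

print(f"{mental_limit / talk_target:.0f}x the 10 B/s goal, "
      f"{mental_limit / today:,.0f}x today's rate")
# -> 150x the 10 B/s goal, 21,429x today's rate
```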
Although it is quite possible to type faster than most people talk, we process at talking pace, and a talk-based programming interface could allow people to work with 50-100 times the number of code points in a week. So the talking coders would be more like mini team managers, more directly linked to the point testers and with no one below them. Why would you talk code? Well, when asking for things naturally you wouldn't tend to write them down; you can be quicker and more direct with your tongue, and as AI gets more advanced this will make sense for more tasks.
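The 50-100x figure is roughly consistent with the 10 B/s rate above; the straight arithmetic lands at about 70-140x, the same ballpark:

```python
# Consistency check on the 50-100x claim, assuming the 10 B/s talking
# rate, the 40-hour week, and the 1,000-byte points from above.
RATE = 10                         # bytes per second while "talking code"
SECONDS_PER_WEEK = 40 * 3600      # 40 working hours
POINT_SIZE = 1_000                # bytes per point of code

points_per_week = RATE * SECONDS_PER_WEEK / POINT_SIZE
print(f"{points_per_week:.0f} points/week, vs 10-20 today: "
      f"{points_per_week / 20:.0f}-{points_per_week / 10:.0f}x")
# -> 1440 points/week, vs 10-20 today: 72-144x
```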
AI can already do well over 1 PB of some types of intelligence generation, but the human tongue paired with AI might do more for what humans can develop and coordinate.
Again, there is no need to put chips in the brain; we just need to focus the AI effort so that humanity fits nicely in the picture, especially our creative representation rights.