Like most big tech companies these days, YT/Google is primarily interested in automation (in this case, of content flagging). It'd be interesting to know whether the new staff will be flagging content manually (doubtful), reviewing content that has already been flagged by the algorithms (most likely), or refining the automated flagging algorithms themselves (also possible). “Today, 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms,” Wojcicki wrote. “Our advances in machine learning let us now take down nearly 70 percent of violent extremist content within eight hours of upload and nearly half of it in two hours and we continue to accelerate that speed,” she added.
Sounds like they'd be most useful at the "manual appeal" station. That's where creators whose content got flagged send requests for manual review, after which most videos lose their flags. Also sounds like Google is massive on algorithms. They're the technoshamans of the big stage.