
New tool combines AI, human judgment to create video highlights from public meetings
How Illinois Times is using See Gov to expand city council coverage

Illinois Times partners with a new civic nonprofit called See Gov to create video highlights of city council meetings that complement its local government coverage.


The 50-year-old Illinois Times in Springfield, Illinois, aims to cover every city council meeting, but with only one full-time staff reporter, that's not always possible. To supplement its meeting coverage, the paper partnered with See Gov, a tool that helps civic creators and journalists create video highlights of local government meetings. While the tool leans on AI to transcribe and summarize the meetings, it relies on a human to decide which highlights to include in the shorter videos.
The Innovation in Focus team interviewed Alex Rosen, executive director of See Gov, and Michelle Ownbey, publisher of Illinois Times.
Anderson: What are some ways that you can make public meetings more accessible to the public?
Rosen: One of the few positive things that came about during COVID was [local governments] making more of these meetings available through livestream video. We also have artificial intelligence that can understand language very well and can help us review what happened in a meeting. And so when we bring those two together, we can get to what happened and then make what happened visible with unaltered video of the moments that matter in a meeting. We’re taking a three-hour meeting and bringing it down to like 10 to 15 minutes of just the discussions that are most impactful to the community.
Anderson: What role does (or should) AI play in covering public meetings?
Rosen: I actually don’t trust AI, and I should do more testing because it’s continuously evolving, but I have done testing around, “Hey, let me feed you a four-hour meeting, and you tell me what happened.” And the things that it’s going to pull out of that are not necessarily the most newsworthy items. It might have some bias about the way people speak and what seems important. How does [AI] know what’s important to a community? It doesn’t have that context.
So the way See Gov was built is to break down those very long transcripts into smaller bits, what we refer to as “moments,” that could be a minute to three or four minutes, and then summarize those moments. So the AI is just giving you a summary moment by moment of what happened.
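To make that pipeline concrete, here is a minimal sketch of the moment-by-moment approach Rosen describes: split a timestamped transcript into short chunks and summarize each chunk independently, rather than feeding a model the whole meeting at once. This is an editorial illustration only; the names here (Segment, chunk_into_moments, llm_summarize) are hypothetical placeholders, not See Gov's actual code or API.

```python
# Sketch: break a long, timestamped meeting transcript into ~3-minute
# "moments," then summarize each moment on its own. All names are
# hypothetical; the real tool's implementation is not public.

from dataclasses import dataclass

@dataclass
class Segment:
    start: float  # seconds into the meeting
    end: float
    text: str

def chunk_into_moments(segments: list[Segment], max_len: float = 180.0) -> list[list[Segment]]:
    """Group consecutive transcript segments into moments of roughly max_len seconds."""
    moments, current = [], []
    for seg in segments:
        current.append(seg)
        if current[-1].end - current[0].start >= max_len:
            moments.append(current)
            current = []
    if current:
        moments.append(current)
    return moments

def llm_summarize(text: str) -> str:
    """Placeholder for a call to whatever language model does the summarizing."""
    return text[:120] + "..."  # stub; real code would call an LLM here

def summarize_moments(segments: list[Segment]) -> list[dict]:
    """Return one short summary per moment, keeping each moment's timestamps."""
    summaries = []
    for moment in chunk_into_moments(segments):
        summaries.append({
            "start": moment[0].start,
            "end": moment[-1].end,
            "summary": llm_summarize(" ".join(s.text for s in moment)),
        })
    return summaries
```

The timestamps matter because each summary points back to a specific stretch of unaltered video, which is what ultimately goes into the highlight reel.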
We do go through all the moments and say, “Rate these moments as unlikely, maybe or likely to be moments that a local journalist would include in a video highlight reel of this.” Does it have a high impact on the community? Was there a debate back and forth on a topic? So there is a little bit of trying to start to do news judgment, but the human can see all the moment-by-moment summaries and then use their knowledge of the community and the context they have to decide what’s important and what they want to focus on in their highlights.

I think it’s really important to say — because I’m not a journalist, and I have been learning about editorial judgment — it’s definitely the Alex Rosen judgment that gets into the highlights. But See Gov is designed as a tool that can be used by civic creators, newsrooms and influencers to pick out, from their point of view, what’s important in the meeting.

The platform is there to hopefully have some interesting dialogue between people with different views who present things maybe with slightly different contexts. But ultimately, the core of what See Gov outputs is unaltered video of what people said. You can edit it, but in those edits, it’s always got a white flash between [the clips] because it’s very important to not mislead people with deceptive editing.
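Rosen's description suggests a simple rate-then-review loop: the model assigns each moment a coarse newsworthiness label, and a person makes the final cut. The sketch below shows the shape of that step; the prompt wording, llm_rate, and the dict layout are assumptions for illustration, not See Gov's implementation.

```python
# Sketch: label each moment "unlikely" / "maybe" / "likely," then let a
# human editor choose the final clips. The model's rating is a starting
# point, never the decision.

RATING_PROMPT = (
    "Rate this moment as unlikely, maybe or likely to be included in a local "
    "journalist's highlight reel. Consider community impact and whether there "
    "was debate back and forth on the topic.\n\nMoment summary:\n{summary}"
)

def llm_rate(prompt: str) -> str:
    """Placeholder for the model call; returns one of the three labels."""
    return "maybe"  # stub

def rate_moments(moments: list[dict]) -> list[dict]:
    """Attach a coarse newsworthiness rating to every moment summary."""
    for m in moments:
        m["rating"] = llm_rate(RATING_PROMPT.format(summary=m["summary"]))
    return moments

def human_select(moments: list[dict], chosen_indices: set[int]) -> list[dict]:
    """The editor sees every summary and rating, then picks the reel."""
    return [m for i, m in enumerate(moments) if i in chosen_indices]
```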
Lytle: Why were you interested in using See Gov?
Ownbey: We saw it as a way, with our limited staff and limited resources, to be able to provide more coverage of these hyperlocal issues that people care about, right? Because there are a thousand different ways to find out about what’s going on in Washington, D.C., but if you want to know what your city council or your school board or your county board is doing, there are very few ways to do that. The average person can’t spare three hours on a Tuesday night to go sit through this or even go through it the next day. We’re just all about educating the public and keeping people informed. And we do cover topics that come up at the city council and school board quite a bit, but not every week. We just don’t have the capacity to do that. So we thought of [See Gov] as working hand-in-hand with our existing coverage because if we were to write an article about something that was discussed at one of the meetings, we could link the video clips to the article and vice versa. If we’re not writing about anything that particular week, it’s just a good way to keep the public engaged and informed so that when we do write about it, they have more context or background about what’s going on.
Lytle: What does your workflow look like, and are you reviewing each highlight video that comes from See Gov?
Ownbey: I am in no way a fan of AI in general, but this is one area where I can see it being useful because not only is Alex checking it on his end, but I look it over as well. In fact, just today, Alex sent me the highlight reel from the most recent city council meeting, and in the notes next to the video, I noticed there was a typo in somebody’s name, and I just know the guy, so I know that it was a typo. And then [in the transcription], it was supposed to say “replatting,” and it said “replanting” when they were talking about a subdivision. So I just sent a quick note, and Alex has been super responsive. Even journalists, you have to edit. I don’t see it as any different: if a journalist turns in a story to me, I’m not going to publish it without reading it.
After Alex sends me the reel for that week’s city council meeting, it takes me probably 10 or 15 minutes to get it posted and then get it scheduled for our social and our newsletter, because I’m already watching the council meetings to be in the loop. I was doing that even before See Gov.
Lytle: So you post these highlight videos on your website and in your newsletter?
Ownbey: We have a dedicated tab on the website; it goes out in one of our digital newsletters, and then it’s on our social media. And I know Alex had said he was working on some sort of search function, which I think would be fabulous. There are so many times when we are writing about a topic, and we know it has been discussed at city council, but we can’t remember exactly which meeting, or who it was that was mad about this or that. That would definitely make it less time-consuming.
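For illustration, the kind of search function Ownbey describes could be as simple as a keyword scan over archived moment summaries, returning which meeting and timestamp a topic came up at. The sketch below is hypothetical, including the archive layout; it is not the feature Rosen is building.

```python
# Sketch: keyword search over archived moment summaries, so a reporter can
# jump straight to the meeting and timestamp where a topic was discussed.

def search_moments(archive: list[dict], query: str) -> list[dict]:
    """Return every archived moment whose summary mentions the query.

    Each archive entry is assumed to look like:
    {"meeting_date": "2025-06-03", "start": 1250.0, "summary": "..."}
    """
    q = query.lower()
    return [m for m in archive if q in m["summary"].lower()]

# e.g. search_moments(archive, "subdivision replatting") would surface the
# meetings and timestamps where that discussion happened.
```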

Lytle: Anything else you’d like to share about your future plans for public meeting coverage?
Ownbey: Our big challenge is that a lot of these meetings, especially small municipalities and park districts, don’t have a video component. So that’s where I think we might have to go the Documenters route. How do you incorporate that into your coverage if there is no film? Our county board, for example, has audio recordings but not video recordings.
Editor’s Note: This interview has been edited for clarity and brevity.

Cite this article
Lytle, Emily and Anderson, Sophia (2025, July 14). New tool combines AI, human judgment to create video highlights from public meetings. Reynolds Journalism Institute. Retrieved from: https://rjionline.org/news/new-tool-combines-ai-human-judgment-to-create-video-highlights-from-public-meetings/