Book Talk - The Smart Enough City: Putting Technology in Its Place to Reclaim Our Urban Future

Date: 

Tuesday, October 29, 2019, 12:00pm to 1:15pm

Location: 

Ash Center foyer, 124 Mt Auburn Street, Suite 200N


Event Description

The Ash Center invites you to a book talk with Ben Green, author of The Smart Enough City: Putting Technology in Its Place to Reclaim Our Urban Future. Kathy Pham, Adjunct Lecturer in Public Policy at HKS, will serve as a respondent. Stephen Goldsmith, the Director of the Ash Center's Government Innovations Program, will moderate.

Lunch will be served. 

About the Book

In The Smart Enough City, Ben Green warns against seeing the city only through the lens of technology; taking an exclusively technical view of urban life will lead to cities that appear smart but under the surface are rife with injustice and inequality. He proposes instead that cities strive to be “smart enough”: to embrace technology as a powerful tool when used in conjunction with other forms of social change—but not to value technology as an end in itself.

In a technology-centric smart city, self-driving cars have the run of downtown and force out pedestrians, civic engagement is limited to requesting services through an app, police use algorithms to justify and perpetuate racist practices, and governments and private companies surveil public space to control behavior. Green describes smart city efforts gone wrong but also smart enough alternatives, attainable with the help of technology but not reducible to technology: a livable city, a democratic city, a just city, a responsible city, and an innovative city. By recognizing the complexity of urban life rather than merely seeing the city as something to optimize, these Smart Enough Cities successfully incorporate technology into a holistic vision of justice and equity.

About the Author

Ben Green is a PhD Candidate in Applied Math at the Harvard School of Engineering and Applied Sciences and an Affiliate at the Berkman Klein Center for Internet and Society at Harvard. He studies the implementation and impacts of data science in local governments, with a focus on “smart cities” and the criminal justice system. Analyzing the intersections of data science with law, policy, and social science, Ben focuses on the social justice and policy implications of data-driven algorithms deployed by governments. 

Event Audio Recording and Transcription

Transcript

Presenter: You're listening to AshCast, the podcast of the Ash Center for Democratic Governance and Innovation at Harvard Kennedy School.

Ben Green: Are the systems actually possible? Is the technology capable of doing what is promised? And if this technology is capable of doing that, would we want that to be implemented? Are the visions that are being promoted desirable ones for the future of cities? The core argument of my work and my book is that the answer to both of these questions is no. The smart city is a vision full of false promises and hidden dangers.

Presenter: In The Smart Enough City: Putting Technology in Its Place to Reclaim Our Urban Future, Ben Green warns against seeing the city only through the lens of technology. Taking an exclusively technical view of urban life will lead to cities that appear smart but under the surface are rife with injustice and inequality. He proposes instead that cities strive to be smart enough: to embrace technology as a powerful tool when used in conjunction with other forms of social change, but not to value technology as an end in itself.

Presenter: On Tuesday, October 29th, Green joined Kathy Pham, adjunct lecturer in public policy at Harvard Kennedy School, for a discussion at the Ash Center. Stephen Goldsmith, director of the Ash Center's Government Innovations Program and Daniel Paul Professor of the Practice of Government, moderated.

Stephen Goldsmith: Welcome. Thanks for joining us for this conversation about Smart Enough. We've got two great presenters, one who wrote the book, Ben Green, who is affiliated with Berkman and NYU and is an expert on many of the issues that we talk about over here, equity and fairness in algorithms and criminal justice and social policy. Kathy is an expert in ethics and technology, teaches at HKS, and is affiliated with the Berkman Center as well, and Shorenstein.

Stephen Goldsmith: What we're going to do is have Ben speak, and then Kathy rebut what he said, and then turn it over to the audience.

Stephen Goldsmith: The book is an interesting one, I would say, Ben. Years ago, I was sitting in New York City Hall, and this fellow came up and said, "We have 10,000 phone booths in New York City that are put to little use other than for vandalism. What else could they be?" From that, they became 10,000 LinkNYC smart technology pedestals supporting broadband wireless, which I generally thought was a really good idea until I read the first chapter of your book, and found out that it wasn't such a good idea.

Stephen Goldsmith: Now what we're going to do is Ben will tell you about his book, including what I did wrong with LinkNYC, and then Kathy will comment, and then we'll turn it over to all of you for questions. Ben will speak for about 20-plus minutes, then Kathy will speak, and then we'll open it up for questions. Please greet Ben Green.

Ben Green: Thank you. Thank you, Steve, for the introduction, and thanks to all of the Ash Center and other Kennedy School folks who helped to organize this event and put it on. Thanks to everyone for coming out. I'm really excited to be able to talk about this book and this project with all of you, and especially with Kathy and Steve, who have so much to add to this topic that I'll be talking about.

Ben Green: To give a little context of where I'm coming from, this book really brings together a couple of different perspectives and experiences that I've had around cities and technology. The first is my academic work as a computer scientist. I'm a PhD student in the computer science department here at Harvard, and I've spent a lot of time thinking about how to develop, explain, and audit technologies that are being used in government. As part of that work, I went and spent a year in Boston City Hall as a data scientist, a project that was funded by the Taubman Center here at the Kennedy School. Throughout all of that, I've also spent a lot of time studying the social sciences and the broader social impacts of technology to understand, from a non-computer-science perspective: What are these technologies doing? What does it mean when we deploy technology in society, and how should we understand that?

Ben Green: The book really covers three broad topics, and I'll organize the talk around them. The first is this idea of technology as a solution for social problems. Where does that idea come from, and how can we understand it? We can think about the smart city as a manifestation of this type of vision. I'll then discuss some of the limits of technology for social change, why technological efforts to solve social problems can go wrong. Then, finally, I'll think about some of the ways that we can use technology well in cities and not just throw the entire project away, but understand how to bring together technological and other perspectives to address social problems.

Ben Green: We can start by defining and thinking about what a smart city is. I think it's best to go to the source, one of the technology companies driving this vision, and we can look at Cisco. They write, "By definition, smart cities are those that integrate information communications technology across three or more functional areas. More simply put, a smart city is one that combines traditional infrastructure, roads, buildings, and so on, with technology to enrich the lives of its citizens." This matches what the word smart has come to mean across society, from smartphones to smart homes to the smart toaster or the smart toothbrush: taking traditional objects and putting digital technology inside of them. Whether that's an internet connection, data collection, or algorithms, we're putting that inside traditional items, objects, and processes.

Ben Green: In the smart city, there's a wide range of technologies that are part of it. It's not a singular vision of a specific technology or process, but an array of different types of technologies that get deployed in different cities and across different purposes and different agencies.

Ben Green: They range from sensors put up on street poles that collect information about the surrounding conditions, whether temperature and weather data or information about the people nearby, gathered through cameras and connections to the digital devices in your pockets. There are self-driving cars, of course: automated vehicles that can drive around without a human driver. Algorithms, machine learning, and artificial intelligence are a central part of this, taking all the data collected by sensors and by other agencies and analyzing it to understand what's happening and forecast what will happen in the future. And there are various smartphone apps and other connectivity functions that try to connect people to one another and to the city government.

Ben Green: All of these technologies are deployed in different ways across cities, agencies, and functional areas, which makes the smart city both hard to pin down and very expansive: it's a broad vision of applying these technologies to cities, and every city has a slightly different spin on what that looks like.

Ben Green: But across the board, the smart city has emerged as pretty much the consensus vision for the future of cities and municipal governance. From major tech companies to national and local governments around the world, the smart city is being eagerly pursued. In the United States, just about every city has some sort of smart city project or vision, and cities often brand themselves around them. Kansas City, Missouri, calls itself the world's most connected smart city. San Diego brands itself as having the world's largest smart city platform. This has become a really important part of cities' self-image and of the vision for the future that they pitch to their constituents, to businesses, and to other places.

Ben Green: As we look at these approaches that center technology as the solution to the whole host of urban challenges cities face nowadays, we also have to start scratching beneath the surface and asking a couple of questions. Are the systems actually possible? Is the technology capable of doing what is promised? And if this technology is capable of doing that, would we want it to be implemented? Are the visions being promoted desirable ones for the future of cities? The core argument of my work and my book is that the answer to both of these questions is no. The smart city is a vision full of false promises and hidden dangers.

Ben Green: What are the limits of using technology as the centerpiece of solving these social problems? There are really three broad issues that I'll talk about. The first is how this approach distorts our understanding of what the problems are in the first place. Second, there are limits to the technology itself and what it's capable of. And third, there are fundamental aspects of the technological architecture, how smart city technology is designed, that can reshape power and decision-making in ways that often don't get enough attention.

Ben Green: The broad vision I think of when I think about where smart cities come from is this idea of tech goggles: a particular way of seeing the world endemic to engineers, but certainly one that others have adopted from them, that sees every aspect of society as a technology problem and wants to find technological solutions.

Ben Green: There are really two broad myths underlying this vision. The first is that technology drives social change: that the way to pursue progress is to implement new technology, and that the technology is what will prompt such progress. The second is that technology is neutral and objective, so that the solutions and progress it drives can be treated as socially optimal.

Ben Green: We even have folks like the former president of IBM saying, "If the leaders of smarter city systems do share an ideology, it is this: we believe in a smarter way to get things done." There's this idea that technology and being smart transcend politics and broader questions of equity in decision-making, and can simply be agreed on by everyone as a better way of doing things.

Ben Green: Now, we can begin to wonder what sorts of visions come from this, and the simulation that may be a little difficult to see here is one vision of smart cities put forward by some researchers that I think is a great encapsulation of what these types of visions do and where they go wrong. What it's trying to show is a city street intersection with self-driving cars: how, with the advent of automated vehicles, we can get rid of traffic lights and let the cars sort themselves out, drastically eliminating congestion.

Ben Green: What's remarkable about this vision is not the speed at which these cars are zooming through the intersection, but everything else that's not part of the simulation. There are no people. There are no cyclists. There are no buses. There are barely even buildings. It's hard to imagine that this is a city at all, and it's hard to imagine who might even be using those crosswalks that are so clearly painted in the middle of the intersection.

Ben Green: What's remarkable, in particular, is that this is not just an abstract depiction of any intersection or rural freeway interchange; it's actually a simulation of a specific intersection in downtown Boston that looks like this. It's the intersection of Mass. Ave. and Columbus Ave. around the South End. If you go out to that street corner, you'll see people walking around. You'll see tons of cyclists. It's right along the route of the 1 bus. There's this entire context of what's going on here, both in terms of mobility and in terms of the broader social structure of the city. This is really at the epicenter of Methadone Mile, the center of Boston's opioid epidemic.

Ben Green: There's an incredible amount of mobility and other social context happening here that has been completely erased to create this optimization. By eradicating this complexity, we get a version of this street without traffic or congestion, but in turn we erase everything else that we might think is valuable on these streets.

Ben Green: I think of this as a process of distortion, where the lens of tech goggles looks at a city street like this and instead sees an urban streetscape dominated by technology: discrete objects that can be modeled, predicted, and analyzed in a computational manner. Rather than recognizing the complexity and the politics of these aspects of urban life, smart city idealists describe cities as abstract technical processes to be optimized using sensors and data and algorithms. The process of solving these abstracted versions of social problems often creates more problems than it solves, unexpected ones, and carries a great deal of underlying politics in terms of what gets modeled and what gets ignored.

Ben Green: I'll talk about a couple specific examples of technologies in some of the work that I did in Boston and some of the broader landscape of smart cities that shows the limits of these visions and where they can go wrong and the deeper underlying issues.

Ben Green: One of the first lessons that we learned in Boston really clearly was that social problems are not technology problems. One of the projects we were working on was helping the city do a better job of disseminating and sharing open data with the public, datasets that we were collecting from agencies and other departments and sharing that information with the public so that they could do something useful with it.

Ben Green: Rather than just staying inside city hall and thinking about what to do, we went out and tried to talk to people across the city and went to a bunch of different libraries to catch just the average passerby and say, "What information are you looking for? What sorts of questions do you have? What sorts of data would you like to see?" What we found was that these conversations were incredibly brief. We would stop a person and ask them what data they'd like to see from the City of Boston, and their faces would go blank and they would very politely try to end that conversation as quickly as possible.

Ben Green: But it's not that they were rude and didn't want to talk to us. We just weren't asking the right questions. When we stopped them and said, "How do you like the libraries? How long have you lived in this neighborhood? What sorts of things are you concerned about?" a very different picture emerged. We had long, really rich conversations with folks, and a whole range of issues came up, from policing to affordable housing to buses and transportation and education. But what became clear was that these were not fundamentally problems connected to the data itself or to any open data we could provide. As one Boston resident put it, "Information is fine, but I want a way to influence what's happening."

Ben Green: It's very clear that technology does not just magically empower people. There's this vision around smart cities and open data that, by putting data or other technology in people's hands, we will create a more democratic society that magically empowers people. That vision, as we saw here and in many other smart city cases, fails to account for the particular dynamics of what these problems are and what the ways of remedying them might be.

Ben Green: Another lesson that we encountered in Boston was that data can be misleading or biased. One of the projects I was working on was trying to use data to inform how we repair sidewalks. There are, obviously, tons of sidewalks in the city of Boston, and the city has a limited budget for how to repair those sidewalks, so we were trying to use data to say, "How can we prioritize our investments towards the sidewalks that most need it, that will most benefit people if we can repair those sidewalks?"

Ben Green: Traditionally, what the city had been doing was looking at requests for service from the city's 311 app, so looking at places where a member of the public had taken out their iPhone or other smartphone, taken a picture of a broken sidewalk, and submitted that picture requesting service for improvements. But we wanted to know, does that actually tell us where we should be doing the sidewalk repairs? Is that a good guide?

Ben Green: We compared the requests for service with another assessment we had done of sidewalk quality, and what we found was an incredibly different image of the city. On the left here is sidewalk quality, where the heat map's red blobs show the lowest-quality sidewalks. We can see that there are sidewalks in need of repair pretty much across the city, although clustered in certain neighborhoods. Then we compared that to the map of where requests for service are coming in. If you look at that map on the right, it's entirely in downtown Boston, Back Bay, the South End, the Financial District. That paints an entirely different picture. Had we simply followed that data, we would be focusing all of our repairs in one part of the city, the neighborhoods with the most resources and the most well-off constituents, and ignoring the rest.

Ben Green: What we realized was that this data was telling us something very different from what we expected. We might think the information tells us where sidewalks need repair, but what it was actually telling us is where there are clusters of people who have smartphones, who think this is an issue that should get fixed, and who trust the government enough to submit a photo and expect the problem to get fixed. When you think of it that way, you're seeing a very different portrait, and a very different outcome emerges.

Ben Green: You have to be very careful. All of these cities relying on data as the backbone of this technology have to be incredibly careful to understand the data within its broader context, to know what it actually means and not just what we think it means or what we might want it to mean in an ideal world.
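To make this pitfall concrete, here is a minimal sketch, in Python with entirely hypothetical file and column names, of the kind of comparison Green describes: ranking neighborhoods by 311 complaint volume versus by an independent condition survey, then flagging where the two diverge. It illustrates the general technique, not the city's actual analysis.

```python
import pandas as pd

# Hypothetical inputs: one row per neighborhood.
requests = pd.read_csv("311_sidewalk_requests.csv")   # columns: neighborhood, request_count
survey = pd.read_csv("sidewalk_condition_survey.csv") # columns: neighborhood, condition_score (0 = worst)

merged = requests.merge(survey, on="neighborhood")

# Rank neighborhoods two ways: by complaint volume (most complaints = rank 1)
# and by surveyed need (worst condition = rank 1).
merged["rank_by_requests"] = merged["request_count"].rank(ascending=False)
merged["rank_by_need"] = merged["condition_score"].rank(ascending=True)

# A large positive gap means sidewalks are bad but few people file 311 requests.
# Following the request data alone would leave these neighborhoods behind.
merged["rank_gap"] = merged["rank_by_requests"] - merged["rank_by_need"]
print(merged.sort_values("rank_gap", ascending=False)
            [["neighborhood", "request_count", "condition_score", "rank_gap"]])
```

The point of the sketch is the diagnostic, not the ranking itself: demand-driven data (311) and need-driven data (the survey) describe two different cities, and only comparing them reveals who the demand data is silently leaving out.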

Ben Green: Then there are questions of the architecture of this technology, the way its design and structure can reshape urban power and politics without that being explicitly intended, or without many of us even realizing it's happening. Much of the smart city's technology, as I've mentioned, relies on collecting data about the public, and that means someone is collecting that data, massively expanding the reach of surveillance: for law enforcement, which has access to this information, and for private companies, through projects like LinkNYC that, while providing a valuable benefit in some ways, are also operating beneath the surface by allowing law enforcement, companies, and other agencies to collect incredible amounts of data about the public.

Ben Green: Then there are broader questions of privatization and decision-making. Much of the smart city relies on public-private partnerships with companies that can provide the technology and the resources to develop and manage it. But the decisions about what the technology is doing, and what assumptions and values are baked into the models, the apps, and the infrastructure, are often handed over to these companies, who are unaccountable. Typically, it's impossible to get information about what the technology companies are doing because of trade secrets and other protections. So much of the decision-making that was traditionally the authority of public bodies is being shifted over to private ones.

Ben Green: A particularly salient example of this is in Toronto, where Sidewalk Labs is running a project to develop an entire city neighborhood. Sidewalk Labs has asked for control of the whole development, has even gone so far as to ask for some of the tax revenue, and, before even being given the license to do so, expanded the reach of its development plans by an order of magnitude in terms of how much land it expects to manage.

Ben Green: There are these broad questions about how these smart cities are structured and operated: in what ways are we bringing in technology, and who's controlling that technology?

Ben Green: Now I want to shift to thinking about how we can use technology well. What are ways of reorienting these relationships and understanding these limits that can help us actually solve some urban problems in real ways, while bringing the opportunities of technology to bear responsibly?

Ben Green: The shift that I advocate, the framing of the book, is two words: smart enough cities, inserting this word enough into the framing. The key idea is that the language of smart cities presents a very singular, narrow axis of progress. The smart city is one where there's more technology, and a smarter city is one with even more technology. Doing better along the scale of the smart city is solely focused on the technology, as much as there are often peripheral aspects that people will discuss in relation to that technology.

Ben Green: The goal of bringing in this word enough is to change the focus from technology as an end in itself to technology as a means towards broader outcomes around justice, equity, and mobility, and really to reframe the question away from the technology itself and towards: If we're trying to be smart enough, what are we trying to be smart enough for? What outcomes are we trying to pursue?

Ben Green: I'll talk about two specific visions of what this looks like in practice, what different cities are doing to make better use of technology, both in terms of outcomes for residents and in terms of the internal infrastructure and processes they manage in order to work with data more effectively.

Ben Green: The first story is in Columbus, Ohio, which in 2016 won a $40 million grant from the Department of Transportation to create a first-of-its-kind smart transportation system. What one might expect on hearing this, a smart cities vision plus transportation, is self-driving cars zooming through town and everyone having access to Uber and all of those things, getting around as seamlessly as possible. But Columbus didn't focus on the technology. Instead, it focused on the ways in which technology and mobility are connected to social mobility and to social outcomes related to equity, and then on what technology could do within that.

Ben Green: Columbus had a couple of different things it had been working on for a long time. The first was reenvisioning its urban planning strategy. Over the last century, the city had pursued a vision of sprawl, growing outward while neglecting the urban core, and it had spent a long time on a 30- to 50-year visioning process asking, "What do we want the urban planning of this city to look like? How does that connect to mobility? How can we promote more mixed-use, dense urban development over the next 30 years?"

Ben Green: The second piece was a focus on high infant mortality and poor prenatal healthcare in some of the poorest neighborhoods in the city. There are certain highly segregated neighborhoods where the city had been focusing on improving access to healthcare, in particular prenatal and postnatal care for mothers.

Ben Green: They brought both of these initiatives to bear on their approach to technology. Rather than jumping straight to the tech, they asked, "What are we doing? What are the goals we're trying to accomplish? How can technology help us?"

Ben Green: They went out and talked to residents in these neighborhoods to really understand their challenges and their needs, and they found a much more complex and thorny set of issues than they might have expected. It's not simply access to a self-driving car that these mothers need. They need better access to information about public transit; better ways of paying for these systems, especially services like Uber, which typically rely on smartphones and credit cards they may not have; and even childcare, so that their other children are taken care of while they go to the doctor or to a job interview.

Ben Green: By looking at it from this broader perspective, actually understanding the wide range of issues people were facing, they were able to come up with many different solutions, none of which looked like your flashy, amazing smart city solution, but which together much more holistically addressed some of the issues residents were facing.

Ben Green: The second story is in New York City, which in 2015 faced an outbreak of Legionnaires' disease traced to building cooling towers. The challenge for the city was to understand: where are these units, which ones are infected, which ones have we cleaned, all of that. The problem was that the city didn't actually know. There was no comprehensive dataset of where these cooling towers existed or which buildings had one. In the midst of this outbreak, a crisis that they needed to manage as quickly as possible, there was not just a typical public health emergency but really a data emergency: a question of how we can bring together data to paint a picture of where we need to go, what's going on, what's our status. The city lacked this data, and even the different pieces of information that different departments had were completely disconnected from one another. They were not organized in a way that would let them talk to each other and allow someone to create a comprehensive picture from these datasets.

Ben Green: Out of this process, they were able to pull things together and ultimately use some algorithms to help address the crisis. But they also realized that what they needed was to focus on the internal, day-to-day ways of managing their data and technology infrastructure. It wasn't just a question of collecting more data, because it would be impossible to know exactly what dataset a future crisis would require. It was about improving their knowledge of how to work with data: how to collect data in a way that makes it amenable to analysis, and how to make sure that information can be shared across government.
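Here is a minimal sketch of the data-integration problem Green describes, assuming hypothetical departmental files and columns: records about the same buildings can't be combined until they share a common key. This illustrates the general approach, not New York's actual systems.

```python
import pandas as pd

def normalize_address(addr: str) -> str:
    """Crude shared key: uppercase, strip punctuation, collapse whitespace."""
    return " ".join(addr.upper().replace(".", "").replace(",", "").split())

# Hypothetical departmental files that were never designed to be joined.
permits = pd.read_csv("building_permits.csv")        # columns: address, has_cooling_tower
inspections = pd.read_csv("health_inspections.csv")  # columns: address, last_inspected

for df in (permits, inspections):
    df["key"] = df["address"].map(normalize_address)

# An outer join exposes the gaps: permit records with no inspection history
# (left_only) and inspections of buildings the permit data doesn't know about
# (right_only). Those gaps are the "data emergency."
merged = permits.merge(inspections, on="key", how="outer", indicator=True)
print(merged["_merge"].value_counts())
```

The design point is that the hard work is in the shared key, not the join: departments that record the same building as "124 Mt. Auburn St." and "124 MT AUBURN STREET" effectively hold disconnected datasets until someone standardizes the identifiers.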

Ben Green: One of the things they did was create what they call data drills: essentially fire drills, but for city departments trying to work with data. Rather than first coming across these issues in a real crisis, they create a simulated crisis where everyone gets together for two days and learns what the data can tell them, how to bring it together, and how to work with it in a fast-paced environment, so that they actually understand how to use it well. They also created more comprehensive systems that bring together datasets from agencies that normally don't work together, helping them paint that broader picture and use data effectively.

Ben Green: I'll stop there and turn things over to Kathy for a response. Thanks.

Kathy Pham: Okay. Thank you, Ben. I really like the slide you had for this one. I want to keep it there for a little bit.

Kathy Pham: How many folks in the room have either been vendors or contractors, where you were the one in the company selling to a government entity? Has anyone been? A few folks in the room. Then how many of you have been in some kind of a leadership or government role where you were the ones buying or influencing buying of the technology? A lot more folks. Conversation will be awesome afterwards.

Kathy Pham: Ben had asked if I would come today, and my first question was, well, which aspects of my background do you think would be interesting for this? I spent about a decade building technology in the private sector at Google and then, in 2014, went into the public sector to help build something called the United States Digital Service, which is a tech startup inside government at the White House. For the last two years, I've spent a lot of time thinking about ways we can build more responsible technology in the private sector and, on the public sector side, ways to build technologies that don't fail the society we're trying to serve.

Kathy Pham: I spent some time at IBM and saw how contracting works and how big companies will sell products to governments whether or not they actually work; they just sell them anyway, governments buy them, and after some $10 million and five years, no one is better off. For those of you who have bought technologies for government, that probably sounds familiar.

Kathy Pham: I also saw how, inside tech companies, we would build technology where our whole task is just to build really, really good technology that lets you do whatever it is you want to do. What I mean by that is: we want to empower you to collect more data; we want to empower you to have the fastest car to go through a city unassisted, without understanding the social climate you are going to parachute into, because that's not our task.

Kathy Pham: When I think about Ben's book and these three points of technology as the solution, that's what we... My background is also in computer science. That's what we learn in engineering school. We build technology to be the solution. The societal implications, the economic climate, the political climate around us, put that aside. We build technology.

Kathy Pham: The limits of technology for social change. How many of you have worked in a big tech company, a private-sector tech company? A few. There's this general sense that your tech is changing the world. If you walk in and just sit down... maybe not so much in the last two or three years, because of the state that tech is in right now, but before that, if you went and sat in any of the big tech companies, people were there because they wanted to change the world. It's this idea that you're building technology and you're changing the world, but oftentimes with very little recognition of the limits of that technology.

Kathy Pham: Then using technology well. We just default to many of the things Ben said.

Kathy Pham: I think what I really, really wanted to leave you all with, or to have you think about... it's really complicated to have someone... maybe not complicated. But today, if we're in a government, we don't really have this ideal team where the government employees themselves deeply understand technology, deeply understand data, deeply understand how to do human-centered design and user experience, and also understand government, understand policies, understand the community. We really need all those pieces in the same place to really know what it takes to build a smart city or a smarter city or a smart enough city.

Kathy Pham: Otherwise, what you're left with is some tech company going into some government saying, "I have all these ideas. It's so great. You can use this technology to solve all of your problems. You'll take homeless people off the streets. Everyone who should get benefits will get access to social benefits. Any student who wants to go to college can get information at any point in time telling them exactly what school to go to. Every single veteran in the United States of America will get access to healthcare. We have the technology to do this for you." All things I personally have heard at some level of government, either the state level or the federal level.

Kathy Pham: When you don't have informed people inside government to make those decisions, which we don't really have right now, we end up in a cycle where we buy technologies that might not work, try to deploy them in our cities, and ultimately don't really solve the social problems we bought the technology to solve. Oftentimes, we propagate the problem. In the United States alone, we spend $86 billion on federal IT. That number may seem high or low depending on what sector you work in, but 94% of those projects... this is the number from, I think, 2017... are either delayed, never delivered, never see the light of day, or just don't work.

Kathy Pham: The smart city stuff falls in that same category, and it's complex why it doesn't work. Part of it is not having enough expertise. Part of it is vendors preying on governments. Part of it is the procurement laws we have in this country. But it's really good to remember and think about that when someone is selling something like a smart cities platform that will just make all of our cities smarter and better and make lives better for our citizens.

Kathy Pham: What else? There was something else you said that I wanted to make sure I-

Stephen Goldsmith: Well, Kathy, while you look at that, let me ask you and Ben a question.

Kathy Pham: Yeah.

Stephen Goldsmith: Starting with Ben, then we'll go to audience questions, and then if you think of it, feel free to chime in. Some folks have to leave at 1:00 for classes.

Kathy Pham: Yeah. I'll just sit down as well-

Stephen Goldsmith: Ben, I'm just going to ask one. We run a program here at Ash called Data-Smart City Solutions, where we try to evangelize officials to do the things you caution against. An alternative definition of a smart city might be how smart officials, and the community, are in using data to make better decisions, whether you think about the biases in the algorithms or about everyday activities. I'm not disagreeing with the comment you and Kathy both made about context.

Stephen Goldsmith: But what about the argument that the way we act without being a smart city, without the data, just perpetuates the status quo? Let's take an example. There are a lot of legitimate complaints about biases in algorithms, but there ought to be a lot of complaints about biases in everyday activity that isn't evaluated by any algorithm. So my basic question is: let's think about the smart city as defined by smart decision-making, and then how does that affect your three-part analysis?

Ben Green: Yeah, that's a really interesting question. I think there's a lot of different pieces that I've thought about in that dimension quite a bit.

Ben Green: I think the broadest question there is this question of change. What is our comparison point? One of the things that's definitely out there is that a lot of the smart city conversation is set up as a dichotomy with the dumb city. There's a sense of "Well, if you're not doing this, then you're a dumb city. You're stuck in the past. You're against innovation. You're against progress." I think that framing of the smart city is incredibly narrowing and limits our understanding of what the possibilities for change are.

Ben Green: The comparison to me is not "Is this specific type of technology better than doing nothing?", which sometimes it is and sometimes it isn't, but "What is the broad range of things we can try to do? What are the many opportunities available to us as pathways of innovation? What role can technology play within that, and how can we shift our thinking about the role of technology away from 'We're going to take the technology; we're going to put it onto this problem' to a different way of thinking about it?"

Ben Green: I completely agree with the point about smarter decision-making and informing public officials. I think a lot of that, as I would frame it, falls very much within the last story I was telling about New York, about bringing data to bear for public officials in a new way.

Ben Green: I see a lot of efforts to reclaim the word, to redefine what smart city means, and I agree with many of the alternative visions that are put forward. But what it fundamentally means is this vision around technology. If you look at where the vision comes from, in many ways it's one that's been developed and pushed forward by the multinational technology companies, especially, in the early days, Cisco and IBM, and now Google and others. As much as we may try to redefine what we're applying smart to, or what smart is, I think the framework of the smart city is hard to separate from that technology-centric mode of thinking.

Kathy Pham: Yeah, I really appreciated you bringing up the point that this is the status quo in real life. We have real-life bias. We have scenarios in real life where, for example, there are streets that need to be fixed, potholes that don't get filled, people who don't get access to care, and so on. Therefore, we are looking for all these solutions to figure out how to solve those problems.

Kathy Pham: I build technology, and I'm probably going to continue to do so for a very long time, because I believe technology can really help us in many ways. But at the same time, when we blindly buy technology without really understanding the impacts it can have, we can make those same problems a lot worse rather than better. One of my biggest concerns is that we don't necessarily have the people in our governments who know how it can make things worse or better, and we end up buying something that actually just makes things worse for us. Why spend $10 million on something only to have it worsen what we already have?

Stephen Goldsmith: Questions, the audience? Jerry, you've been a technology face from my past, so you've taught it, you've sold it, you've evaluated it. You've been on all sides of this for 25 years.

Jerry: 25 is a calm number. There are two things I was thinking about as I was listening to this. One is that, yes, in any analytic process, we define a problem and that simplifies it, so there is a danger that we have zeroed in on the wrong problem, and therefore the data we collect can be very misleading, as you were showing. In many ways I agree with you, but I would also say that set of problems has been with us for a very long time.

Jerry: What seems to me important, and not necessarily addressed in what you were looking at, and might or might not really be important, and I'll come back to Steve on this: at the Kennedy School we often talk about when it is that people want to work together rather than work on their own, because if you don't need to work with somebody else, many people would rather just do it themselves. There are three criteria for when working as a group goes well. The first is that we're more productive as a group, and productivity... I was a budget director, so I often ran the numbers to try to say whether, per dollar spent over a certain period of time on a certain kind of problem, we were more productive.

Jerry: However, much of real-world decision-making turns on the other two factors: Do we think the decisions are fair? Do we distribute the results to the right sort of people? That is a negotiation among people that we fight over, and fight very fiercely over, and sometimes people would rather not even analyze that problem, because even talking about it may make it harder to solve. We know it's hard to solve because we've set up an authority structure that will get us a decision when we need it. The military has forever known that if you're in the middle of a battle, that's not when you say, "Well, let's stop and call a meeting and get a consensus." You have to have a fast response.

Jerry: I'm sorry to go on in this vein, but it seems to me that as technology moves from just calculating how we get to the moon or calculating the county, a task rather than the entire structure of an institution in a society, we're now engaging so many more people that the bigger issues are becoming equity and trust that the authority is right. So I would be very interested in seeing, as we seek to apply technology more and more, whether we are getting better ways to think about priorities in resolving issues of equity and trust, in addition to the traditional issues of productivity.

Jerry: The last thing I'll say, going back to Steve: years and years ago, when I was pushing technology largely on productivity grounds, he, as a politician in a Midwestern state, was suggesting that most of his audience never believed government would be more productive, but they did see jobs leaving the Midwest, as they have, and worried a lot about that issue. That was the frame he put on it. Instinctively, as a politician, he was looking very directly at the equity issues and the things that would motivate people to understand. That he was making smart decisions about productivity wasn't as important as competitiveness, which people understood.

Stephen Goldsmith: Ben, putting aside the app-driven calls for service in Boston for a second, which I thought was really interesting, respond to Jerry in terms of how you think technology is going to either aggravate the equity issues or mitigate them.

Ben Green: Yeah, the questions of equity and trust are really the central questions of smart cities. I completely agree. It really pushes in both directions. Much of what I was talking about, algorithms relying on biased data, systems massively increasing surveillance, systems handing control over to companies, those are places where I see equity issues often getting worse because of technology, even technology billed as addressing some dimension of equity, whether in policing or welfare or other contexts. There have been examples of algorithms meant to make these systems better where, because of the interactions between what the algorithm is doing and how people interact with those systems, the equity problems actually get exacerbated.

Ben Green: There are, however, a number of ways in which I see the public really trying to put forward different approaches. Then there are questions of authority. What I see as the major shift in terms of creating better outcomes along the equity and trust dimensions is about questions of authority.

Ben Green: One of the major trends that I've seen has been public pushback against surveillance and against this technology, which has led to various ordinances in cities like Cambridge, Somerville, San Francisco, Oakland, and Berkeley. These shift decision-making authority over surveillance or algorithmic technology away from the agency itself and towards broader public input into these systems, saying, "Before you develop this system that's going to put sensors on every street and collect a ton of data, there has to be a city council hearing, there has to be public comment, and you need to do audits on those systems." In Boston, or I think it was Cambridge, the ACLU showed that having that in place, even though it's only been enacted for about six months, has limited the police department's use of some technologies and limited their procurement of certain technologies.

Stephen Goldsmith: Kathy, I'll get you to respond, and we'll take another question. You think about data and equity, so let's go to Ben's map of 311 app calls versus the street conditions mapped otherwise. That doesn't strike me as technology aggravating the equity issues. That strikes me as a lack of imagination about how to use the data that's around. Think about examples in your work where data enlivened a public conversation to resolve the equity issues, as contrasted to merely aggravating them.

Kathy Pham: I think you actually hit a key point there. Oftentimes data should be used to enliven a conversation: here is a dataset, these are all the 311 calls, here's a map of Boston, and someone who knows the region will be able to tell you pretty quickly the nuances of Somerville, Boston, Roxbury, Cambridge, Cape Cod, and so on. But someone who's removed from the situation may just end up making a decision about the city without understanding those nuances.

Kathy Pham: You see cases like that with, for example, Amazon Prime, where they decided, based on their data, that they just weren't going to deliver to... I believe it was Roxbury or Dorchester... It was Roxbury. But no one in Seattle understood. Anyone who knew the area and looked at that map would've said, "We know what you're doing," but people who don't know it are just being data driven and make decisions off the data.

Kathy Pham: If, in an ideal world, you have a dataset that enlivens the conversation and drives decisions about what you should do, that's great. Oftentimes, though, the results of the data make it into some product decision without real checkpoints along the way to say, "Oh, wait a minute. That looks a little weird." There are cases where people are able to look at the data, but in government, probably because of a lack of understanding of what to do with the data, we often just take it at face value, go do something with it, and then, to Ben's point, make the equity problems even worse.

Ben Green: Let me-

Stephen Goldsmith: Mark? We got a whole bunch of hands. Let's take-

Kathy Pham: Some students. That's all.

Stephen Goldsmith: ... two or three questions, and we'll come back. Is that all right?

Ben Green: Yeah, sure.

Stephen Goldsmith: Mark, and then we got a couple back right here.

Mark: This issue of data and equity is a very important one, and I think we're talking about equity in not a particularly differentiated or useful way. One idea of data and equity is whether people have an equal opportunity to participate in the process of decision-making about how public assets are going to be used. It seems to me that the use of digital technology, along with social media and the like, holds great opportunities and great perils for equality in political discussion and discourse, as well as for the technical quality of what's being discussed within those realms. Equity in decision-making.

Mark: There's another kind of equity, which has to do with whether data helps us see the degree to which the things we're producing for citizens, at the individual and the collective level, are fair or not, because data allows us to observe differences among individuals and between geographic areas, at both the individual and the aggregate level, and therefore to incorporate into our judgments about whether something is good, efficient, and effective an idea about whether it seems en route to producing a more just as well as a more prosperous society. So you can imagine: participate equally, see the results.

Mark: This, I think, also connects up very importantly with the question about data and algorithms. Can you put up your slide of the taxi cab... Not the taxi cab. The pavement one?

Mark: One way to observe this, and it's kind of an interesting way to think about it: what you've got on the right-hand side there is, in some sense, effective demand, understood as client services. A good government service is one that responds to customers, and the customers show up with demands and political power. What you have on the left is professional judgment about what represents a good sidewalk.

Mark: So on the left we have a professional's estimate of how evenly distributed good sidewalks are, and on the right we've got a consumer vision of what good sidewalks are. Just as consumers guide markets because they have more money, consumers with more confidence and political power, as you mentioned, can guide government through the political system toward their purposes rather than others'. I have used this as a way to describe why one of the bad ideas associated with technology would be to empower clients.

Kathy Pham: Yes.

Mark: So in some sense, part of your idea about equity would be to empower a global professional view of what's happening, and then to speak for the people who haven't been represented adequately, either at the client level or at the citizen level, in the operations going forward.

Mark: Last point, and this is the one that's most troubling for me. I do a lot of work in policing and child protection and things like that. Gradually, the algorithms have become, essentially, critically important devices for prioritizing the particular populations to be engaged in an effort to improve conditions. We're always trying to distinguish high-risk people from low-risk people in those areas. That nearly always involves the conception of "a profile."

Mark: Notice that we construct that profile for two quite different reasons. On one hand, when we're trying to become more efficient and effective, we're thinking about a triage system that allows us to get resources to the people who could use them or benefit from them the most. When we're thinking about it from a justice point of view, we're thinking about the people who need or deserve them the most.

Mark: Virtually every service delivery scheme in government has some eligibility standards that are set up to say, "These people deserve it, and these people don't." They are also shaped partly by our concerns for efficiency and effectiveness, which say, "Let's make sure the resources go to the people who are most needy." Every single one of these systems for sorting people is going to make errors of two types, and sometimes those errors will be racially associated and sometimes not. Getting people to understand that profiles are everywhere, that they're used for the purposes of both efficiency and justice, and that their quality has to be adjudicated on those terms, is a critical part of applied uses of technology.
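A minimal sketch of Mark's "errors of two types," using synthetic data and a hypothetical risk score: any profile that sorts people into high and low risk produces both false positives and false negatives, and both rates can be checked per group rather than only overall.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)          # two demographic groups, 0 and 1
actual_need = rng.random(n) < 0.2      # who truly needs (or deserves) the service
score = actual_need * 0.5 + rng.random(n) * 0.6  # an imperfect risk/eligibility score
flagged = score > 0.55                 # the triage threshold: who the profile selects

for g in (0, 1):
    m = group == g
    # False positive: flagged but not in need. False negative: in need but missed.
    fpr = (flagged & ~actual_need & m).sum() / (~actual_need & m).sum()
    fnr = (~flagged & actual_need & m).sum() / (actual_need & m).sum()
    print(f"group {g}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```

In this synthetic setup the two groups come out roughly the same; the point is the measurement itself. In real deployed systems, these two error rates can diverge sharply by group, which is exactly the adjudication of quality Mark is calling for.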

Stephen Goldsmith: Ben, let me get a couple student questions. Yes? Thanks.

Katie King: Katie King. I'm a student here, but I also used to work at a Boston company. I'm curious about both of your perspectives on data that originates not in the public domain but in private ones. I'm thinking about Uber and Lyft data in our transportation planning. I'm thinking about autonomous vehicle crash data and public safety.

Katie King: There's a tension between the data the private sector holds in its business interest and the good use that data could be put to by public entities. Any thoughts on how to productively have that conversation now about privately held data?

Ben Green: It's funny, I don't know if you were there for this, but it's something the City of Boston was working on. I don't know what years you were there, but Jascha Franklin-Hodge, who was the CIO, had a big fight with Uber because they had done a whole PR blitz around sharing data and then didn't actually share much.

Ben Green: Now, I think there are two tensions. There's the opportunity of what this data can share and tell us, and on the flip side there are the privacy implications of incredibly detailed, granular data about where people are traveling all the time. Los Angeles has created a new mobility data standard (the Mobility Data Specification) for data sharing. This is a place where I see city governments' real leverage: actually making their licensing and permitting contingent on some form of data sharing. They have the authority to do that type of thing. But at the same time, there are some very tricky questions about what data is shared and who manages that data, to ensure that all of Uber's data doesn't just end up in the public domain, which we also wouldn't want to happen.

Kathy Pham: I love this topic in general. I feel like we're right at the beginning of trying to figure this out. There is a group called Data & Donuts that Vanessa Rhinesmith runs, where they talk about some of these topics. I worked for a company, Google, that probably has one of the largest datasets in the world, and on one hand you think, "Some of this data should be public, whether it's Uber or Lyft or some other data, or should at least be shared with government." But then what does that mean for privacy and sharing with governments? And if they do decide to do that, what does that look like? What kinds of violations do we have there? I don't think we really have rules or even policies in place to deeply understand that yet, so I think we're just at the beginning of figuring out some of these answers.

Kathy Pham: There are companies that are now opening up their datasets to academic researchers via more official IRB routes, where you take a lot of care with the data. But you can also imagine a case where one of these companies shares information with some department somewhere, and now really marginalized communities get exploited because government knows a lot more about them. I don't think anyone really has the right answer yet. We're at the beginning of having that conversation.

Stephen Goldsmith: Other questions?

Speaker 8: In addition to the smart technology, smart cities conversation that we're having, I think there's another tier, and it alludes to a lot of the equity things that we've been talking about. Underneath that is the smart infrastructure. A lot of municipalities all across the country right now are being asked by communications companies, as a first stage, to start installing 5G, and what are the implications of that for the landscape and, as with this heat map in particular, for equity?

Speaker 8: If these companies are allowed to go only where the money is going to give them the most bang for the buck initially, we're just going to double down on the inequities we're seeing in the data, because we're not capturing as much, or we're not able to deliver better technological products, because that kind of infrastructure is lacking in certain geographic places that aren't represented well. What should municipalities be asking these companies for with respect to data sharing, to begin with, and anything else, when we start talking to them at the contract level?

Ben Green: Yeah, I think that one of the places where city governments have almost failed to recognize the power they have with regard to technology is letting the technology companies dictate the terms of what they're doing, Uber being the most egregious example of this, just coming in without any permission, without even asking, and then really running roughshod over cities. I think that city governments have learned their lesson a little bit, realizing, especially larger cities like New York or San Francisco, that you have the ability to walk away from the table or to enforce these standards.

Ben Green: I think that as the conversation has shifted away from "Technology is amazing. Technology is going to solve these problems," towards "Technology has opportunities, but there are also a lot of risks, and we have to really understand where those dangers are," city governments are becoming a lot more thoughtful and are willing to set those boundaries on the terms, and to use that negotiation process as a way of creating the right environment for the public interest and the private interest to be aligned, while limiting the places where those interests are going to diverge.

Kathy Pham: Do you remember the... Susan Crawford and Maria's work? What do they call it?

Ben Green: Fiber.

Kathy Pham: There was something else. There was a title for either a book or a paper they wrote. Susan Crawford over at the law school, she has an appointment here. Are you familiar with Susan's work? They look a lot at tech deserts in general: putting aside all the smart stuff up here, the infrastructure that underlies our entire country, and how you can live on one side of a line and have access to the internet and be able to apply to college and do work, and then you can sit on the other side of that line and none of that applies, and the ripple effects that has on people's livelihoods. We don't talk about that enough.

Kathy Pham: When we let companies, especially private-sector companies... Google Fiber is a great example: call it a vanity project, call it a deep desire to change the world, it could be any one of those things, but they took on this task of going and transforming internet access for the whole country, and in some cases kind of failed. What's different from the public sector is that there's not the same deep sense of responsibility to citizens as there should be, and so it can be quite dangerous to have these companies go in and determine what the technology looks like, which translates into the way people can or can't operate day-to-day, which translates into their economic stability, etc.

Stephen Goldsmith: Time for one more question or two? Well, a couple more questions. We'll start back here, and we'll go around there and take these two more, and then we'll finish.

Speaker 9: How can we bridge the gap between the knowledge and perspective of technology workers or engineers and the people in the government who are buying their stuff?

Speaker 10: Fascinating presentation. I was just curious about whether there are efforts to have rules on what information is disclosed about what technologies cities are using, how they work, and what data they're collecting.

Stephen Goldsmith: Last question, please.

Martin: Hi, my name is Martin. When you first put that slide up, the first thing that popped into my mind was, "Here is a beautiful civics education opportunity." To keep the question short, I'm just wondering whether this made it back into the Boston Public Schools in any way, as part of their civics education program for citizen engagement.

Stephen Goldsmith: All right, folks.

Ben Green: I'll work backwards. This has not worked its way back into civics education, although there are definitely lots of cool ways they could do that. It did, though, end up shifting the way the city worked, bringing many more different types of data and much more public engagement into informing this process. So it led to a completely changed internal process for managing the system.

Ben Green: One of the major shifts in public access is through these various task forces and ordinances that are enforcing transparency and accountability for these types of systems. For example, the surveillance ordinance that Cambridge has requires the city to publish all of the surveillance equipment and software that it has, do impact assessments, and release much more information. One of the major trends that's happening is policies that actually enforce this type of transparency and dissemination of information about what cities are doing, what companies they're working with, what data is being collected, all of that stuff. There's definitely a trend in that direction.
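As a hedged sketch of what that kind of disclosure could look like in machine-readable form (the structure, vendor name, and URL below are hypothetical illustrations, not Cambridge's actual published format):

```python
# Hypothetical surveillance-technology registry entry; all values are
# illustrative, not drawn from any city's real published inventory.
registry_entry = {
    "technology": "Automated license plate reader",
    "vendor": "ExampleVendor Inc.",  # hypothetical vendor
    "department": "Traffic, Parking & Transportation",
    "data_collected": ["plate number", "timestamp", "location"],
    "retention_days": 30,
    "impact_assessment_url": "https://example.city.gov/alpr-impact.pdf",
}

def publish(entries):
    """Render registry entries as a public, human-readable listing."""
    for e in entries:
        print(f"{e['technology']} ({e['department']}): "
              f"collects {', '.join(e['data_collected'])}; "
              f"retained {e['retention_days']} days")

publish([registry_entry])
```

Publishing inventories in a structured form like this is one way ordinances can make oversight by residents and researchers practical rather than theoretical.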

Ben Green: Then maybe I'll let you take the last one since that's mostly your domain.

Kathy Pham: The question about how we get better-informed policymakers, or people just buying technology in government. Again, we're behind, and we're at the beginning of that. We have things available now that are what I call band-aid fixes, like the TechCongress fellows, where technologists rotate in through Congress for maybe a year or two to inform Congress and lawmakers on how to make technology decisions. So instead of being a lobbyist, you can go be inside government and help make decisions.

Kathy Pham: We have organizations across the world similar to the United States Digital Service, in the UK, Australia, Estonia, Taiwan, so many countries, where technologists are coming to work inside government. They're working to build technology, but because they're in the room, they end up answering questions about all these things. You cancel contracts that don't make sense. You call out companies that are making promises that don't hold up. Right now it's AI and blockchain; companies will make some promise to government, and having a tech person in the room to call upon really makes a difference as well.

Kathy Pham: There's a whole public-interest tech movement that has existed for a long time, but now there's a lot of foundation money being thrown at it that is looking to address exactly this. Schools like the Kennedy School decided, I think only four years ago, that this was important. There's an open faculty position for a tech policy person. Really, all of you students are in a position where you can help shape what that looks like.

Stephen Goldsmith: All right, let's thank our speakers, please.

Presenter: You've been listening to AshCast, the Ash Center for Democratic Governance and Innovation's podcast. If you'd like to learn more, please visit ash.harvard.edu, or follow the Ash Center on social media at HarvardAsh.