Policy Briefs

The Analytics Playbook for Cities: A Navigational Tool for Understanding Data Analytics in Local Government, Confronting Trade-Offs, and Implementing Effectively

Abstract:

Amen Ra Mashariki and Nicolas Diaz, August 2020 

Properly used, data can help city government improve the efficiency of its operations, save money, and provide better services. Used haphazardly, however, analytics in cities may increase risks to citizens’ privacy, heighten cybersecurity threats, and even perpetuate inequities.

Given these complexities and potentials, many cities have begun to install analytics and data units, often headed by a chief data officer, a new title for data-driven leaders in government. This report is aimed at practitioners who are thinking about making the choice to name their first CDO, start their first analytics team, or empower an existing group of individuals.


About the authors

Amen Ra Mashariki is the Global Director of the Data Lab at WRI. There he works across programs, international offices and centers to identify data solutions to turn big ideas into action to sustain our natural resources—the foundation of economic opportunity and human well-being. Amen Ra is also a Data + Digital fellow at the Beeck Center for Social Impact and Innovation where he partners with academic, public and private sector thought leaders to shape best practices and strategies on how to use data to maximize impact in our communities.

Prior to this, Dr. Mashariki served as Head of Machine Learning at Urbint, adjunct faculty at NYU’s Center for Urban Science and Progress (CUSP), Fellow at the Harvard Ash Center for Democratic Governance and Innovation, Head of Urban Analytics at Esri, Chief Analytics Officer for the City of New York, Director of the Mayor’s Office of Data Analytics at the City of New York, White House Fellow, and Chief Technology Officer for the Office of Personnel Management of the United States.

Amen earned a Doctorate in Engineering from Morgan State University, as well as Master and Bachelor of Science degrees in Computer Science from Howard University and Lincoln University, respectively.

Nicolas Diaz Amigo is a graduate of the Master in Public Policy program at Harvard Kennedy School. He has held research and fellowship roles at the Bloomberg Harvard City Leadership Initiative and at digitalHKS. Professionally, he served as the first Coordinator of Public Innovation at the City Hall of Santiago, Chile, and has consulted for cities and local government organizations across the world on the topics of public sector innovation, data analytics, digital transformation, performance management, budgeting, and process improvement.

About the series editor

David Eaves is a lecturer of Public Policy at the Harvard Kennedy School where he teaches on digital transformation in government. In 2009, as an adviser to the Office of the Mayor of Vancouver, David proposed and helped draft the Open Motion, which created one of the first open data portals in Canada and the world. He subsequently advised the Canadian government on its open data strategy, where his parliamentary committee testimony laid out the core policy structure that has guided multiple governments’ approach to the issue. He has gone on to work with numerous local, state, and national governments advising on technology and policy issues, including sitting on Ontario's Open Government Engagement Team in 2014–2015.

In addition to working with government officials, David served as the first Director of Education for Code for America — training each cohort of fellows for their work with cities. David has also worked with 18F and the Presidential Innovation Fellows at the White House providing training and support.

Acknowledgements

This report has been possible with the sponsorship of digitalHKS, a project at Harvard Kennedy School that studies the intersection of digital technologies and governance. The authors are grateful for the generosity of the public servants who provided case studies. Additionally, many individuals provided invaluable feedback to earlier versions of this report, including Kadijatou Diallo, Tommaso Cariati, Westerley Gorayeb, Natasha Japanwala, Lauren Lombardo, Michael McKenna Miller, Nagela Nukuna, Emily Rapp, Naeha Rashid, Imara Salas, and Blanka Soulava. This report is an independent work product and the views expressed are those of the authors.

About the Ash Center

The Roy and Lila Ash Center for Democratic Governance and Innovation advances excellence and innovation in governance and public policy through research, education, and public discussion. By training the very best leaders, developing powerful new ideas, and disseminating innovative solutions and institutional reforms, the Center’s goal is to meet the profound challenges facing the world’s citizens. The Ford Foundation is a founding donor of the Center. Additional information about the Ash Center is available at ash.harvard.edu.

This research paper is one in a series published by the Ash Center for Democratic Governance and Innovation at Harvard University’s John F. Kennedy School of Government. The views expressed in the Ash Center Policy Briefs Series are those of the author(s) and do not necessarily reflect those of the John F. Kennedy School of Government or of Harvard University. The papers in this series are intended to elicit feedback and to encourage debate on important public policy challenges.

This paper is copyrighted by the author(s). It cannot be reproduced or reused without permission. Pursuant to the Ash Center’s Open Access Policy, this paper is available to the public at ash.harvard.edu free of charge. 

Executive Summary

Properly used, data can help city government improve the efficiency of its operations, save money, and provide better services. Used haphazardly, however, analytics in cities may increase risks to citizens’ privacy, heighten cybersecurity threats, and even perpetuate inequities.

Given these complexities and potentials, many cities have begun to install analytics and data units, often headed by a chief data officer, a new title for data-driven leaders in government. This report is aimed at practitioners who are thinking about making the choice to name their first CDO, start their first analytics team, or empower an existing group of individuals.

In the following pages, we provide city leaders with:

  • Definitions. To understand what an analytics team in city government looks like, as well as alternative team structures in the realm of data, digital government, and innovation.
  • Principles. The attitudes and mindsets that are useful when trying to enact ambitious change in local government, such as ensuring executive buy-in and seeking allies outside of government.
  • Plays. The organizational, strategic, and tactical considerations that a city must think about for starting and supporting an analytics team. These are presented in a sequential order: how to set up a team for success, building up from scratch, managing the day-to-day, as well as important considerations (like privacy and ethics) that should be present throughout.

1. Introduction: The potential use and misuse of data analytics in city government 

Imagine a world where our cities not only deliver the services that we expect from them, from weekly trash collection to running our schools to supporting local businesses, but also tailor those services to each of its citizens. Like Netflix recommending the next show you should watch based on your previous views or Amazon steering you to a product you didn’t even know you needed, local governments could connect citizens who request a service with other social programs they qualify for.

Imagine a city hall not only reacting to problems — a student dropping out of school or a building catching on fire — when they occur, but using data to identify risks and take preventive measures before they happen.

These are not pipe dreams. Municipal departments already possess an abundance of data, from utility bills to tax records to school enrollment histories, though the value of that data is often locked away in silos. City analytics, a discipline that combines operations research and data science, has the potential to unlock the big insights behind the big data and move local governments toward higher efficiency and effectiveness in the provision of services and the prevention of social ills.

When it comes to the use of advanced analytics, the public sector needs to catch up to the progress seen in the private sector over the past few decades. But strictly copying private sector practices is not the right approach; there are special considerations for handling data collected by a public institution. With all the potential analytics hold for improving the delivery of local services, the risks must also be carefully examined by analysts, the senior leaders who oversee their work, and civil society at large. A misuse of analytics may pose cybersecurity threats, threaten privacy, or reinforce existing inequities.

Lessons from NYC: Breaking silos increased the need for cybersecurity

To exemplify the risks of misusing analytics, let’s take the example of cybersecurity. A few years ago, New York City was considering improving its monitoring system for underground assets such as sewage pipes, which were being tracked by several agencies, each with its own software system and database. Bringing all of those together into a single system would have allowed for more efficient management. But while a breach of any particular database would not be especially catastrophic, having the city’s entire underground mapped by bad actors would be a major threat. The standards for how safely data is stored and shared internally had to be reexamined.

Doing analytics in government also comes with unique barriers, such as the difficulty of hiring individuals with the right technological background or moving an organizational culture that does not always welcome innovation.

But that doesn’t mean there is no path forward. In fact, an increasing number of cities are beginning to systematically incorporate analytics, often in the form of new teams headed by leaders with titles like chief data officer, chief analytics officer, or chief performance officer. This playbook examines successes and failures, and shows what hard choices have so far been made along the way. An analytics team must face difficult questions that do not have clear answers: Who do we hire first? How do we prioritize our projects? How much time should we spend communicating our successes? By illuminating some of the trade-offs cities have faced, we hope this document will help city officials find a strategy that works for their own context, limitations, and objectives.

In sports, it is important for a coach to understand the wide variety of plays that may come in handy in different circumstances. A playbook provides a sense of orientation in a context of high complexity. What should happen at the start of the game, what do we do when we are in overtime, etc. But just like in sports, understanding the generic plays is not enough to win the game. It’s all in the execution and flexibility in adapting to whatever resources are available.

The lessons in this playbook rely heavily upon Amen Ra Mashariki’s experiences as head of the Mayor’s Office of Data Analytics (MODA) in New York City, where he led a team of nine analysts and policy specialists. To illustrate some of the lessons throughout this document, we will call out these insights in boxes like the one on the previous page. But we have also put these ideas to the test by consulting some of the leading professionals in the field, who have generously provided case studies that illustrate the potential and barriers of urban analytics.

One thing readers should be wary of is the temptation to think that every problem can be solved with data and technology. Using analytics heedlessly may even make some problems worse. The intention of this report, therefore, is to allow for a critical application of sophisticated tools. We will address several dimensions of those tools including learning how to identify which problems are suited for an analytics approach, understanding potential issues, setting up the right team to tackle those issues, and making sure the team gets the right resources for success.

This document is structured as follows. We start by defining what an analytics team is and what it is not. Then we highlight principles that guide successful teams. Finally we will highlight specific plays: the key trade-offs, questions and frameworks that an analytics team in local government must consider in order to effectively pursue its mission.

This playbook won’t give you all the answers for starting an analytics team. Instead, it is meant to orient you toward questions that allow you to connect the work you want to do with whatever is important and feasible right now. The skill sets of your employees, the legal framework you are operating under, and even the political cycle will mold your journey. We have tried to offer some thoughts and ideas, but ultimately you will have to decide which plays to focus on at any given time. To get better at the game, you have to start somewhere.

2. Definitions

In their desire to move toward data-driven organizations, well-intentioned public managers may fail to fully grasp what they are getting themselves into. Not all organizations need an analytics team, but by providing a clear sense of what analytics teams are and what they are trying to accomplish, we hope to make it easier for senior officials to choose whether to pursue this path.

2.1. What an analytics team is

Analytics teams may take different names and vary in size, scope, and mission across different jurisdictions.[1] Our basic definition of an analytics team in local government is a specialized group that applies data analysis to uncover insights with the hope of improving the operations of city departments.

In practice, what we most often find is a team between one and 20 employees that has been formally integrated in the bureaucracy. This team will leverage data-intensive tools such as machine learning or mapping to define, evaluate, and propose solutions to public problems. Its mission statement is usually some variation of “improving the outcome of what the city does to create more public value for constituents at lower cost” — though the most pressing areas to tackle to deliver public value will change depending on where you are and whom you ask.

2.2. What an analytics team is not

Other data-driven management and digital-government adjacent groups can take these forms:

  • A traditional information technology (IT) team that maintains technology assets and keeps systems running
  • A performance management team that works across departments to define the relevant metrics for success
  • A digital service team that ensures interoperability of the data, such as allowing members of one department to easily access data from other departments
  • An open data team that maintains an online portal aimed at sharing city data sets with the general public
  • A data governance team that sets citywide standards and convenes analysts across the city to build a citywide data culture

Note that these functions can be complementary to the work of an analytics team. In some places there may be overlap (for example, a chief data officer may set open data policies), and cities may choose to prioritize some of these over others.

3. Principles

While no single blueprint can work for every city, you will find throughout this report some common principles that underlie any effort to bring more analytical thinking to local governments. Teams should:

  • Commit to better service delivery. An analytics team must be driven by a desire to make government work better for its citizens in a transparent and accountable way.
  • Be both the disrupter and the listener. Given the nascent state of analytics in government, analytics teams may have to act like internal entrepreneurs who question the way things are done. But this desire for disruption must be paired with a profound curiosity to understand the nuances and challenges that led to the status quo.
  • Start with the problem, not the solution. An analytics team should avoid the temptation to run around with fancy tools in search of places to apply them. Instead, its strategic outlook and tactical moves should be centered around pressing challenges faced by the city or its individual departments. And when a need is found, a sense of practicality and viability should supersede the drive for sophistication. Simple is better than complex.
  • Ensure executive buy-in. As talented and persuasive as data-driven public servants may be, support from key stakeholders at the top is essential. Whatever your specific institutional arrangement, buy-in from your mayor, her chief of staff, the city manager, or someone sufficiently high up will allow you to overcome the resistance to change you’ll encounter along the way.
  • Work in the open. The work should not be top secret. The projects undertaken should be considered part of a conversation about public service delivery that is appropriately communicated within city hall and with the public. Blog posts can help explain the aim of each project, and code can be posted online for scrutiny. A commitment to transparency should ensure that those seeking information about a process or its results can easily access it.
  • Find allies. You will require allies both inside and outside the organization. In this playbook we will discuss in detail ways you can engage with department heads to create quick wins, when to extend ties to academia, and how the press can come in handy.
  • Quickly deliver value, but think long term. The team will need to rapidly demonstrate what it is bringing to the table and deliver quick wins. However, team members must also be aware that to create even more value they need to think about long-term investments in infrastructure and changes to the work culture. An effective manager must carefully consider both time frames simultaneously. 

4. The plays

 

4.1. Setting up the right team for success

This section begins by looking at the most relevant part of your analytics effort: the people behind it and how they should be organized.

4.1.1 Step zero: Making the case for data analytics

How do you argue the need for an analytics team?

Before beginning to establish an analytics team, stakeholders may need to be persuaded. Whether it is a city administrator, a councilwoman, or the general public, a convincing case must be made that a new team will help the city reach its goals.

Jane Wiseman, CEO of the Boston-based management consulting firm Institute for Excellence in Government, has studied cities investing in analytics efforts[2] and points toward multiple types of value that can be attributed to data teams:

  • Fraud detection cost savings
  • Efficiency improvements that reduce costs
  • Accuracy improvements that reduce costs
  • Increased revenue capture
  • Efficiency improvements that improve outcomes
  • Operational changes that increase safety
  • Increased faith in government due to more transparency

Naturally, not all stakeholders will care equally for every one of these. A city administrator may be more concerned with cutting costs than with transparency, for example. It is important to tailor the message in a way that considers particular stakeholder interests.

On the issue of improving city finances, a McKinsey study[3] on the use of data analytics in government to find fraud found that the return on investment could be as high as 15 times the cost. Wiseman also points toward two municipal data teams that self-reported their return on investment (ROI): the Louisville Metro Government calculated a five-to-one return for its analytics efforts, and the Cincinnati Office of Performance and Data Analytics calculated $6.1 million in added value to the city over two years of operation, a nine-to-one return.
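For readers who want to see how such a figure translates into budget terms, here is a back-of-the-envelope calculation. It assumes ROI is simply value delivered divided by program cost; the implied cost is an inference from the reported numbers, not a figure published by either city.

```python
# Rough ROI arithmetic (illustrative only). Assumes ROI = value delivered / program cost;
# the implied cost below is back-calculated from the reported figures, not an official number.
reported_value_added = 6_100_000   # Cincinnati: reported added value over two years
reported_roi = 9                   # reported nine-to-one return

implied_two_year_cost = reported_value_added / reported_roi
print(f"Implied two-year program cost: ${implied_two_year_cost:,.0f}")  # roughly $680,000
```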

4.1.2 Defining structure: Decentralized or centralized?

Should the analytics team be in one department or agency or spread out among several?

An analytics team may be a single central office that takes on projects across the city (centralized model) or a collection of employees who are spread among various departments (decentralized model).

To decide between the two models, a useful rule of thumb is to think about what types of projects would generate the most immediate value for senior sponsors.

If you want to use data to tackle an issue that is handled by a single department (e.g., crime prevention, which is the responsibility of the police department), then a decentralized model, with analysts that sit permanently in that department, will allow for more sophisticated and in-depth research.

On the other hand, if your aim is to address a challenge where different agencies share responsibility (e.g., you want to know which buildings to target for inspection in a city that splits code enforcement tasks among the fire department, housing, etc.), then you should establish a centralized, dedicated unit. Having analysts who work with multiple departments will avoid task duplication and also reap the benefits of breaking down data silos instead of having each department create from scratch its own models with partial data.

In “Data-Driven Government: The Role of Chief Data Officers,”[4] published by the IBM Center for the Business of Government, Wiseman offers a systematic comparison of the centralized and decentralized models for government agencies, summarized below.

Chief data officer (CDO) focus

Centralized:
  • Analytics resources in a single team under the CDO’s leadership
  • CDO team provides data and analytics services to key executives and managers, functioning as an internal consultant
  • Team works in partnership with executives and managers to define scope and project needs
  • Some bureaus or agencies — typically the better resourced or statistics-driven ones — may have their own analytics resources, but the majority rely on the centralized CDO team

Decentralized:
  • CDO team creates distributed capacity across government by embedding talent in bureaus through training and coaching
  • CDO team creates tools, platforms, and data standards that speed adoption of data skills in bureaus
  • Each bureau/agency/office is responsible for developing its own analytics capability, which can range from budget and policy analysts who can complete basic descriptive statistics and dashboards to analysts with the skill to perform data science tasks such as predictive analytics

Ideal for

Centralized:
  • Specialized skills or subject matter expertise needed for highly technical work
  • Analytics projects that require a high degree of confidentiality, such as investigations

Decentralized:
  • Large-scale enterprises with similar or low-complexity operations spread across many bureaus, divisions, or geographies
  • Agencies with a high level of existing data skill or broad adoption of data literacy, such as scientific or statistical agencies

Benefits

Centralized:
  • Centralized pool of analytics talent allows sharing of specialized skills across the enterprise from a common hub, which saves money since highly trained analytics staff can be expensive for government
  • Efficiencies are gained via peer support and collaboration among team members
  • Centralized team is better able to standardize tools and processes across government, which can save time and money and help develop deeper expertise in the chosen tools and methods
  • Team can facilitate cross-organizational data initiatives due to its enterprise-wide view of data assets and needs

Decentralized:
  • Leaders and managers in bureaus have more control over their analytics resources, may get more timely responses to their requests, and may also more immediately deploy analytics insights
  • Analysts embedded in bureaus develop subject matter expertise that makes them valuable to their leadership and speeds time to results
  • Embedded analysts can foster greater adoption of data culture across the enterprise, which can lead to faster organizational culture change
  • Skills gained in self-service analytics are transferable across government, spreading the benefit

Limitations

Centralized:
  • Slow growth of sustainable analytics talent in the bureaus
  • Can be challenging to achieve scale with a small centralized team, as surge capacity may need to be deployed for a high-priority task

Decentralized:
  • Putting decision-making and control of analytics in the hands of bureau heads leads to uneven attention and results, with some investing heavily and others giving it low priority or not appointing an analytics officer unless compelled to do so
  • Decentralized model limits peer cohorts for data-focused employees and may result in a more limited career ladder


Case study: Chicago’s CDO has a centralized mandate[5]

The first U.S. city to have a chief data officer (CDO) was Chicago. After briefly being part of the mayor’s office, the position was then moved to the Department of Innovation and Technology.

Tom Schenk Jr. served as the city’s second CDO and as deputy director of IT, which gave him both a centralized mandate to drive data analytics and the operational responsibility for maintaining and upgrading the city’s databases and digital platforms.

This centralization of responsibilities allowed Schenk and his team not only to push for concrete data analytics projects, but also to take control of data governance. They began building a data inventory of the entire city and optimizing the data infrastructure for easier analytics in the future.

Being able to inventory the data was essential for setting up future successes. However, Schenk warns that quick wins are an absolute necessity. “Stuff will get hard and you will need to ask a lot of people [for help],” he says. “An inventory of data will take a lot of time and effort, and both residents and internal stakeholders will not perceive its value until you show some progress and real, tangible results.”

Schenk and his team also had to consider how to choose which projects to spend time on. To decide, the team worked closely with the University of Chicago’s master of science program in Computational Analysis and Public Policy to create a framework for screening projects and assessing the data maturity required for meaningful analysis.[6]

A longer write-up of this case can be found in the annex.

 

4.1.3 Who to hire

Who is the first person you hire for your analytics team? What about the second, third, and fourth?

There is no predetermined profile or career progression for the team members in an analytics office. One reason for that is the need for teams that cover a wide variety of tasks, from project management and performing complex analyses to knowing how to build effective cooperation with the domain experts and more.

An article published in the Harvard Business Review in 2019 noted that data science teams in the private sector often attempt to discover “profound new business capabilities,” and because of this such teams must be made up of generalists who are focused on learning and iteration rather than specialists who do only one thing efficiently.[7] This argument can be made even more strongly for local governments because of the immense quantity and complexity of the services they provide.

For your first team member, consider looking for someone within the organization who has some local experience, rather than hiring out of a prestigious economic-analysis consultancy or data science firm. Even if the job requires a good understanding of analytical methods, the first employee’s main challenge will be effectively navigating the tacit idiosyncrasies of a political and organizational environment. An urban planner at the transportation department, for example, may have better insight into both the current challenges the city is facing and whom to partner with to get stuff done. Keep your eyes open.

When considering further hires, note that analytics offices around the world commonly include professionals as diverse as:

  • Data scientists
  • Urban planners
  • GIS managers and analysts
  • Political scientists
  • Social science researchers/analysts
  • Mathematicians and statisticians

Some skills will be valuable regardless of previous occupation: empathy (to understand the challenges of others), communication (to work effectively in collaborative teams), and curiosity (to always be looking out for status quo thinking that needs to be challenged).

Keep in mind that cultural diversity adds to the strength of the team (especially if it mirrors the diversity in your city) and will help avoid analytic projects with blindspots in issues such as race, differences among neighborhoods, etc. For more on this, see the passage on algorithmic bias in section 4.4.2, “The pitfalls of analytics.”

4.1.4 How to hire

How do you overcome some of the challenges in attracting talented and motivated individuals to local government?

Once you start looking outside the organization, it will not be easy to hire who you want. Young, ambitious professionals well-versed in analytical skills may feel more at home in a small start-up than in a big, clunky bureaucracy. A McKinsey article about trends in the workforce[8] highlighted the importance millennials put on flexibility, the availability of mentor relationships, and the autonomy to tackle their own projects — qualities not usually associated with the public sector.

Moreover, there is the issue of money; the public sector usually has strict rules about how much to pay people according to their place in the organization chart, and the average starting salary for data scientists is often higher than what most agency heads earn — so highly skilled professionals will be offered significantly less than what they could earn elsewhere.

One advantage that you do have is the ability to offer impact and purpose. You can structure your portfolio of projects in a way that gives each analyst the opportunity to work on complex urban challenges that may affect the lives of thousands of citizens, while bringing forth new methodologies and ideas. Team members may also have access to high-level individuals in the city. But this also requires you to create a workflow with the proper autonomy for everyone on your team.

Local universities and colleges are an important resource. Whether you are located in a small city or a large metropolis, there should be a college nearby with a data science program. A few universities even offer specialized programs, such as the University of Chicago’s Master of Science in Computational Analysis and Public Policy[9] and New York University’s Urban Analytics track for its Master of Urban Planning.[10] At the very least, your local educational institution should have a statistics program. If searching locally bears no fruit, you might look into specialized networks for connecting the public sector with academia such as the MetroLab Network[11] and the Data Science for Social Good Fellowship at Carnegie Mellon University.[12]

Lessons from NYC: Accepting a faster changing team

At times, the traditional model of structuring a team in local government may need to be reconsidered. In New York City, Amen knew that if he hired young people he would not be able to retain them for long. So he kept a team with high turnover and a frantic schedule of projects, offering his team the ability to work with autonomy on high-stakes issues.

Effectively engaging educational institutions means giving them access to interesting data sets and projects. For professors and students, getting their hands on real data for analysis is incredibly enticing. Graduate students must often do capstone projects with real-world clients in order to graduate. Finally, you may use this as an opportunity to create a pipeline for talent, offering student internships that may turn into full-time jobs by graduation. Regardless of the specific approach you take, hiring should be a continuous task. You may have to be constantly looking at resumes, nurturing your relationships with academia, going to classes to meet with students, looking for interns, etc.

4.2. Starting from scratch

Once in place, your analytics team will need to get started quickly. This will mean sorting out difficulties and finding a way to demonstrate value quickly. This section goes in depth into those crucial first steps.

4.2.1 How to find your first project

How do you spot a “minimum viable product”?

Your first analytics project is important. It will provide an opportunity to prove the value of the team to leadership and to senior officials throughout the city, so you should start not with a solution in mind but with a real problem that the city is facing — ideally, a challenge that has been clearly identified as a priority by leaders.

To further build the case for the analytics team, it should be a problem that can be tackled by integrating data sets from across multiple agencies. This will show the value of breaking data silos and will probably uncover problems of poor data collection. Furthermore, problems where the data is scattered are often complex policy issues that require deeper thinking — something that may have been ignored so far.
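As a minimal sketch of what “integrating data sets from across multiple agencies” can look like in practice, the snippet below joins two hypothetical departmental extracts on a shared building identifier. The file names, column names, and thresholds are invented for illustration; they are not drawn from any specific city.

```python
import pandas as pd

# Hypothetical extracts from two departments, keyed on a shared building identifier.
fire_complaints = pd.read_csv("fire_complaints.csv")        # building_id, complaint_count
housing_violations = pd.read_csv("housing_violations.csv")  # building_id, open_violations

# Joining the two silos reveals buildings flagged by both agencies,
# something neither department could see from its own data alone.
combined = fire_complaints.merge(housing_violations, on="building_id", how="inner")

# Invented thresholds: surface buildings with repeated complaints and open violations.
flagged = combined.query("complaint_count >= 3 and open_violations >= 1")
print(flagged.sort_values("complaint_count", ascending=False).head(10))
```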

Finally, this cannot be an intellectual exercise only. Although there is plenty of value in visualizing a problem that may have been hidden from the view of stakeholders, to truly cement your first project as a success you should work with your partners toward a clear strategy for delivery. Do not assume that this will happen on its own. Talk with partnering departments beforehand to clarify how analytical results can help with the implementation or piloting of operational innovations.

From the very beginning, identify partners across the entire city government who see the worth of your work. They may be senior officials or new employees. Regardless, make them your data champions. Empower them. But do not forget to communicate to the top; the senior advisor or chief of staff to the mayor may be very busy in her day-to-day, but she should still be frequently updated on what you are doing.

 

Case study: Piloting London’s Office of Data Analytics[13]

In 2016 a pilot began for the London Office of Data Analytics (LODA). It was orchestrated by the Greater London Authority; 12 of the 36 London boroughs; Nesta, an innovation NGO; and ASI, a data science firm.

LODA found its first project by asking for suggestions from the participating boroughs. Then, a workshop session was conducted to collaboratively assess the projects based on:

  • Money saving potential
  • Availability of data
  • The ability to produce insights and deliver results within two months
  • The ability to solve the problem without personal data sets

After the issue selection process, the pilot focused on one use case: leveraging predictive analytics to identify houses in multiple occupation that did not have the appropriate licenses. By combining data sets from multiple sources, the team sought to point inspectors toward likely violators.

Ultimately the project was unable to produce meaningful insights, but it did help inform a protocol for data-sharing among the boroughs.

A longer write-up of this case can be found in the annex.

 

4.2.2 Making the case for more funding

How do you argue for more money?

Every budget appropriation process is different, as the financial, political, and technical idiosyncrasies play out. However, the more clarity there is in the analytics team’s mission, the easier it will be to iterate that mission and construct a compelling narrative. It helps if the mission is closely aligned with the intentions of senior leaders. Often, analytics teams get stuck trying to do a little bit of everything, which muddles the argument for recurring funding.

You should be aware of the downside to following performance indicators, though, since important outcomes are often not measured. Performance indicators will usually point you toward things that are already being done by each department, potentially blinding you to exploring solutions that may not fit neatly within the existing bureaucratic structure. On the other hand, focusing on improving upon existing solutions may make the potential implementation more straightforward. This is a tension that you will have to navigate constantly.

Once you have finished your first project you may choose to scale your work either vertically, by growing the complexity of the analysis you provide to your partners (i.e., moving from describing trends in data sets to predictive work that prioritizes resources), or horizontally, by growing the number of agencies with which you are collaborating.

Lessons from NYC: Look at performance indicators

A logical alternative would be to make sure the projects that the team implements are tied to performance indicators. For example, when Amen was leading the New York City team, they only worked on projects that were part of the yearly quantitative goals that had to be reported to the mayor’s office and to the city at large. If a department had a proposition, the analysis had to have a logical and explicit way to move the numbers toward a goal. This discipline made it easier for the analytics team to track its impact in terms of city priorities met and dollars saved, thus making the case for growth.

 

4.3. Managing the day-to-day

As the analytics team begins to find a comfortable place in the organization, it should consider how to transition into a sustainable workflow that delivers the most value. This section helps to illustrate what you can expect the team’s day-to-day to look like. What sort of projects can the team offer? How does it prioritize projects? How should other departments be brought into the fold? And how do you shift from merely reacting to thinking about the long term?

 

4.3.1 Understanding the multiple uses of analytics: Repertoire of actions

What can you use analytics for?

Once an analytics team is in place and has one or two projects under its belt, it should begin to look for other ways it could add value to ongoing projects, and to help other departments understand what services the team can offer to support their work.

The categories below are adapted from work performed by New York City’s Mayor’s Office of Data Analytics; each pairs a type of data analysis project with the question it answers, why a city would want to do it, and a real use case.

Prioritizing (Where to go first?)
  • Why: Ranking a list according to certain criteria can enable more efficient use of resources. Useful when getting to the worst cases earlier can mitigate potential negative effects.
  • Example: To assist the Department of Education’s work to make all schools ADA-compliant, MODA used DOE data to prioritize which schools to renovate in order to reduce the number of students with disabilities who needed to use buses.

Scenario Analysis (What if?)
  • Why: Considering alternative events and their possible outcomes can help policymakers find the best course of action and plan for a greater range of possible policies.
  • Example: As part of the Mayor’s Office of Long Term Planning and Sustainability’s research for a new commercial composting policy, MODA predicted how much waste local businesses would generate under various regulatory thresholds.

Anomaly Detection (What is out of the ordinary?)
  • Why: Some processes can be improved by identifying and investigating outliers. Useful when looking for the exception is more feasible than examining every case.
  • Example: Registration records of all kinds may have a number of files that display unusual characteristics. Flagging and examining those records may reveal procedural oversight or fraudulent transactions.

Matching (What goes with what?)
  • Why: Matching can optimally pair two groups against a certain set of constraints. Useful for equitably distributing limited resources.
  • Example: When the appointment scheduler for IDNYC was backlogged with duplicate requests, MODA helped match applicants to times and locations based on indicated preferences.

Estimating (How much will a project cost?)
  • Why: Projects can be planned more effectively when time, materials, and costs are estimated in advance. Useful for quantifying the costs and benefits of new programs.
  • Example: MODA worked with the Department of Housing Preservation and Development to estimate the resource requirements and program outcomes for a new set of Enhanced Contractor Review procedures.

Targeting (Where to look?)
  • Why: Targeting can narrow an operational domain to enable better resource allocation. Useful for identifying a subset for a specific intervention.
  • Example: MODA created a model to help identify buildings that have displayed a pattern of unsafe living conditions. This enabled the Tenant Harassment Prevention Task Force to follow up with inspections and enforcement actions when necessary.
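To make the "Prioritizing" and "Targeting" patterns above more concrete, here is a hedged sketch of ranking inspection candidates by a simple weighted risk score. The columns, weights, and data are invented for illustration and do not reflect MODA's actual models.

```python
import pandas as pd

# Hypothetical building-level factors assembled from departmental data.
buildings = pd.DataFrame({
    "building_id": [101, 102, 103, 104],
    "past_violations": [5, 0, 2, 8],
    "complaints_last_year": [3, 1, 0, 6],
    "years_since_inspection": [4, 1, 7, 2],
})

# Illustrative weights chosen for the example, not validated against outcomes.
weights = {"past_violations": 0.5, "complaints_last_year": 0.3, "years_since_inspection": 0.2}

# Normalize each factor to a 0-1 scale and combine into one score so scarce
# inspectors can be sent to the highest-risk buildings first.
normalized = buildings.drop(columns="building_id").apply(lambda c: c / c.max())
buildings["risk_score"] = sum(normalized[col] * w for col, w in weights.items())

print(buildings.sort_values("risk_score", ascending=False))
```

In practice a team would replace the fixed weights with a model trained on past inspection outcomes, but the ranking step, and the operational question of who goes where first, stays the same.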

 

4.3.2 Picking the right projects

How do you choose what to work on?

At some point, you will have to choose between multiple projects that are asking for the analytics team’s resources and attention. Look for projects with the right:

  • Partners: Do you have buy-in from stakeholders to try new solutions and put insights into practice?
  • Data: Is there meaningful data that could lead to insights?
  • Impact: Will the analysis help illuminate a solution for a problem that is relevant?

Chicago’s approach was codified in a form given to technical departments that requested help from the analytics team,[14] allowing users to quickly assess a number of alternatives. Not every city necessarily needs to develop its own questionnaire, as the day-to-day of project intake may be messier, but there should be some sense of which projects advance the analytics team’s mission and the overall goals of the administration.
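As an illustration of how the three criteria above (partners, data, impact) could be turned into a lightweight intake screen, here is a hypothetical sketch; the scoring scale, sample projects, and threshold are invented and do not reproduce Chicago's actual form.

```python
from dataclasses import dataclass

@dataclass
class ProjectProposal:
    name: str
    partner_buy_in: int   # 0-3: will stakeholders act on the results?
    data_readiness: int   # 0-3: does meaningful, accessible data exist?
    expected_impact: int  # 0-3: does it address a relevant, priority problem?

    def score(self) -> int:
        return self.partner_buy_in + self.data_readiness + self.expected_impact

# Invented example proposals for the sake of the sketch.
proposals = [
    ProjectProposal("Restaurant inspection triage", 3, 2, 3),
    ProjectProposal("Pothole photo archive", 1, 3, 1),
]

# Anything below the threshold prompts a conversation about what is missing,
# rather than an automatic rejection.
THRESHOLD = 6
for p in sorted(proposals, key=lambda p: p.score(), reverse=True):
    verdict = "pursue" if p.score() >= THRESHOLD else "revisit"
    print(f"{p.name}: {p.score()}/9 -> {verdict}")
```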

Look for performance metrics. If, prior to the collaboration, there is some measurement of whatever problem is being solved or service is being improved, that will help make the case for the value of the team in the form of dollars saved, extra customers served, or whatever metric is relevant.

Consider scope. Some projects may be multi-year collaborations on complex policy areas, while others may have quicker turnarounds. You may want a combination of both in your repertoire.

Finally, mayoral priorities must be a consideration. Most mayors and city administrators will have a strategic plan or list of goals. These are a good start for finding projects that have a mandate to innovate and the appropriate resources for implementation.

4.3.3 Working with other departments

How can the analytics team be an effective partner to others?

By definition, a city’s analytics team is a partner to other departments.

This is important to keep in mind because it should dictate the attitude taken when relating to internal stakeholders. After all, if anything goes wrong, it’s the department that’s directly implementing a service that usually gets blamed, not the analyst. Avoid the negative spotlight. Don’t assume a project that is fascinating to you will be of interest to whichever commissioner or project director you want to pitch it to. Understand their priorities and avoid politically sensitive subjects when you don’t have the appropriate cover.

Instead, when approaching potential partners actively listen to their goals and concerns. When you do have an idea to bring up, make sure to frame it in a way that is attuned to their interests: “We would love to sit down and discuss how data could help you handle this problem or achieve this target.”

Another good practice in early exploratory meetings is to provide a variety of ways of helping. For example, you could:

  • Provide advisory functions upon request
  • Help research a particularly thorny question through data
  • Offer training to people in the department if they have to use data or a specific software tool in their day-to-day functions
  • Assist in developing a procurement strategy for tools that would best serve their need and allow for better analysis

Finally, remind your partners that you will be in the background and will do whatever it takes to make sure they get recognized for the success of the project.

4.3.4 Data stewardship

How do you think about data management?

Eventually, the team will need to start thinking beyond building effective collaborations that lead to fruitful projects and toward setting the right data infrastructure, or how data is managed, organized, and governed. This is known as data stewardship, and it may be vital to the long-term success of your analytics team.

Some cities, like Boston, have installed centralized data warehouses built in collaboration with contractors and maintained internally (see case study below). Such a tool may or may not be the best alternative for your city; what’s most relevant is the concerted and iterative effort to rethink the city’s data infrastructure, and how that can be connected to the mission of the analytics team.
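The sketch below shows, in very reduced form, what an automated load into a central warehouse can look like: a departmental extract lands in a shared database table that analysts query directly. SQLite, the file names, and the column names are stand-ins chosen to keep the example self-contained; they are not the stack Boston or any other city actually uses.

```python
import sqlite3
import pandas as pd

conn = sqlite3.connect("city_warehouse.db")

# Each department's extract lands in its own namespaced table, so there is a single
# shared copy of the data instead of per-office spreadsheets.
permits = pd.read_csv("inspections_dept_permits.csv")  # assumed columns: permit_id, address, status
permits.to_sql("inspections_permits", conn, if_exists="replace", index=False)

# Downstream analysts query the warehouse directly rather than chasing files.
open_permits = pd.read_sql_query(
    "SELECT permit_id, address, status FROM inspections_permits WHERE status = 'open'",
    conn,
)
print(len(open_permits), "open permits loaded into the warehouse")
conn.close()
```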

 

Case study: Boston’s implementation of a centralized data warehouse[15]

The City of Boston has implemented its own centralized data warehouse, which serves as a repository for different databases that can be used throughout several departments. It was built over a period of three years with a contractor hired through a competitive bidding process. The project started small, encompassing a handful of databases, automating the loading of data into the system as much as possible, and expanding from there. Today, more than 30 departments have their data up and running.

Boston decided to make this investment in data infrastructure primarily for two reasons. First, to create a central repository (a single source of “truth”) and avoid the duplication of data across several departments. Second, to shift the work of data analysts from tracking down and cleaning data to adding value to the data by, for example, understanding the operational implications behind each project or ensuring the robustness of the analysis.

Boston’s warehouse has been instrumental in implementing one of the city’s priorities: its Vision Zero initiative for eliminating fatal and serious car crashes,[16] which combines data sets from multiple stakeholders.

A longer write-up of this case can be found in the annex.

 

4.3.5 Pushing projects to production

How can you turn insights into value?

In whatever realm the analytics team in your city may hope to make a difference, it should ultimately strive to not only create insightful analysis, but also to develop products or tools that are useful for other employees of the city and that (either incrementally or radically) change their day-to-day operations. This requires more resources and a larger team, but it is the key to delivering impact and innovation.

In trying to address this challenge, the head of Transport for London’s data team (see case study below) draws on approaches that are not always common in local government: agile development — a way of organizing work around iteration and quick prototypes — and product management, which oversees the continuous and iterative improvement of the applications that generate business value, as opposed to project management, where projects have start and end dates.

Case study: Transport for London and Leveraging Data Products[17]

As the chief data officer of Transport for London (TfL), Lauren Sager Weinstein heads a team of 70 people, including data scientists, product managers, software developers, and data architects. The team is relatively new in the organization, and part of its mission is to centralize the creation of tools that use the vast amount of data generated by London’s transportation network — from traffic information to traffic-signals data to customer data — while preventing the creep of siloed data tools that traditionally didn’t interact with one another.

TfL’s analytics team is particularly concerned with creating organizational change, since if there is no connection to the actual operations that will be affected, then there is no real business value. To do this, the team has adopted strategies such as:

  • A product management mindset, where product managers continue to work closely with operational experts who use the results of an analytics project
  • An agile methodology for projects, which includes the creation of minimum viable products with defined outcome metrics
  • A commitment to transparency, privacy, and clear communication of their work and its value to the general public

A longer write-up of this case can be found in the annex.

4.3.6 Branding and communications

How should you communicate the analytics team’s efforts?

You need to be proactive about communicating the work of the team.

One reason is that an effective branding strategy will help you with the other challenges we have pointed out: attracting the right talent to the team, building meaningful partnerships with other departments, and securing appropriate resources.

Another reason is that there may be (often well-founded) skepticism and challenges to using methods such as predictive analytics in government. If these fears are not addressed they may paralyze the work of the team.

The most appropriate strategy for the analytics team will likely depend on a combination of the organizational choices and the current state of the conversation. It will require different approaches in different situations, depending, for example, on whether you are working on concrete operational issues or pursuing one of the mayor’s priorities, which is likely to have a marketing strategy of its own.

In any case, be prepared to explain in simple terms what you are trying to accomplish, while also having considered the implications of your work (see Section 4.4.2, “The pitfalls of analytics”).

As much as possible, make your data, code, and insights publicly available. Academic institutions will probably be willing to partner up and further analyze your results, which will help you spread the information to an even wider audience as well as validate your approach.

4.4. Important considerations

In this section we include other key concepts that will be vital for the analytics team. Although we offer only a brief explanation for each of these challenging concepts, we believe that the people behind analytics in government should keep open discussions surrounding how to remake government from a platform perspective, the importance of open data and sharing, the ethical considerations behind this work, and, above all, the understanding that when poorly managed, these projects can actively cause harm.

4.4.1 Open data

How can you use open data requirements as an advantage?

Across many jurisdictions, there is an increasing legal or political mandate for open data policies.[18] In New York City, for example, legislation approved in 2012 demands that all departments upload their data using open data standards. The push has often been led by civil society with the aim of increasing transparency and is predicated upon established principles such as accessibility, timeliness, and completeness.[19] Governments may also have an interest in publishing their data as it may lead to civic innovations.

The call for open data within city hall, whether as a formal decree or an informal demand, can be a great way to speed up the potential for analytical work by giving you more data sets to play around with and build the data infrastructure of the city. But you should consider some caveats.

  • Not all mayors and city managers will be equally receptive (or pressured) to push open data. Therefore, the willingness of departments to comply may vary. Understand the history of your city’s open data efforts, the current landscape, and the applicable laws and regulations to open data, and adapt accordingly.
  • Even if there is a clear mandate, you should watch out for potential duplication. If there is a legal requirement, there may be an office separate from the analytics team that has been set up to help departments with open data compliance. Thus, certain departments may have to send around the data multiple times.
  • Adding data to an open data portal is costly. It will probably involve an active process of data cleaning to scrub away any personally identifiable information.
  • You will have to do some thinking about your users. Most open data efforts assume a standard, generic user and tailor their efforts to whatever is most convenient for the city.

Lessons from NYC: Considering multiple consumers of open data

In reality, you may have different types of users, with different skill sets and different understanding of what data is useful or not. When Amen was in MODA, he commissioned the social impact firm Reboot to do a study that found at least six personas or types of users who were using, or could potentially use, their open data portal — mappers, liaisons, interpreters, explorers, bystanders, and community champions.

 

Several companies offer prepackaged open data solutions that may be cheaper than building your own portal from scratch, but one thing to consider before committing to one is whether that solution has the right users in mind. Another alternative may be inviting citizens to give input into the city’s open data policies; it’s more expensive but you will reap the benefits of a co-created solution that engages the community.
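On the data-cleaning caveat above, the snippet below is a minimal, hedged sketch of stripping direct identifiers from a data set before publication. Column names are hypothetical, and removing columns is only a first step; it does not by itself prevent re-identification (see the next section).

```python
import pandas as pd

service_requests = pd.read_csv("service_requests_raw.csv")  # hypothetical raw extract

# Direct identifiers to drop before anything goes to the open data portal.
PII_COLUMNS = ["requester_name", "phone_number", "email", "exact_address"]
public_release = service_requests.drop(columns=PII_COLUMNS, errors="ignore")

# Publish a coarser location (e.g., neighborhood) rather than exact addresses,
# if such a column exists in the source data.
if "neighborhood" not in public_release.columns:
    print("Warning: no coarse location field available for the public release")

public_release.to_csv("service_requests_open_data.csv", index=False)
```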

4.4.2 The pitfalls of analytics: Privacy, security, and algorithmic bias

What should you, the team, and city leadership watch out for?

As governments start to ingest ever-growing quantities of data, new pitfalls appear. The more granular, rich, and timely your data, the more it will grow in usefulness for analytics, and the higher the risk of potential breach or misuse. Entirely new literatures have emerged over the past decade surrounding the challenges of privacy, security, and bias in algorithms, and universities have created graduate programs for professionals to specialize in these areas. Here we only offer a glimpse of the general things to consider.

Privacy

There is an inherent tension between transparency and privacy, even when you are careful with personally identifiable information. As more and more information is made available in open data portals, concerns arise that something like data on parcels can be linked to individual behaviors. And even if you attempt to anonymize or de-identify certain data sets, studies show that information can be reidentified with a lot more ease than was once thought.[20]
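One simple, commonly used check before release is to measure k-anonymity: count how many records share each combination of quasi-identifiers, since small groups are easy to re-identify when joined with outside data. The sketch below assumes hypothetical column names and an illustrative threshold.

```python
import pandas as pd

records = pd.read_csv("program_participants.csv")  # hypothetical data set under review
QUASI_IDENTIFIERS = ["zip_code", "birth_year", "gender"]
K = 5  # illustrative: require at least 5 indistinguishable records per combination

group_sizes = records.groupby(QUASI_IDENTIFIERS).size()
risky_groups = group_sizes[group_sizes < K]

print(f"{len(risky_groups)} quasi-identifier combinations fall below k={K}")
# Typical responses: generalize fields (e.g., birth year into age bands) or suppress rows.
```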

It is important to listen, respond to the concerns of citizens, and ensure that protocols are put in place in accordance with national, state, and local laws. Understand too that agencies that produce and control data may have their own concerns. Different departments may have different privacy requirements and if you are trying to bring that information into a centralized system, you should be aware of all the limitations.

Cybersecurity

In recent years the number of cyber attacks targeting state and local governments has grown considerably in the United States and around the world. The role of the chief information security officer (CISO) has been getting more attention.[21]

This presents two relevant considerations for the analytics team. First, it should understand that bringing information together under a centralized data infrastructure may create additional vulnerabilities. In the United States, some data types (such as medical information) have specific standards for security that must be met.

Second, the team needs to understand who has the mandate to deliver cybersecurity, which depends on the city’s structure. In some jurisdictions this mandate may not even exist yet and may need to be created. If there is an IT team separate from the analytics team that has been charged with maintaining data security, it should be a close partner in helping the analytics team plan the next steps for the city’s data infrastructure.

Algorithmic bias

Many academic and journalistic articles have been published in the past couple of years regarding the fairness of algorithmic decision-making. Some worry that black box algorithms built on faulty data may perpetuate or even accentuate patterns of inequality already in place. Others retort that with the right precautions and oversight, data can lead to a fairer distribution of resources by being quicker to identify those who need them or by predicting where an intervention can have the largest impact.

Some local governments are taking steps. New York City enacted an ordinance to monitor its automated systems,[22] and states like Vermont are developing guidelines.[23]

While there are no simple solutions at the moment, a good starting place is transparency. Having a scrutable open-source library of every project and auditing any automated decision-making system in the city should help create some trust. One useful resource to consider is the Open Data Institute’s Data Ethics Canvas,[24] which encourages analytics teams to be thoughtful about the primary purpose of any project and consider those who may be negatively affected by it.
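
One lightweight way to operationalize that transparency, sketched below with purely illustrative field names rather than any official schema, is to publish a machine-readable register entry for each automated decision-making system that records its purpose, owner, data sources, and the groups who might be negatively affected, the kinds of questions the Data Ethics Canvas prompts.

```python
# Minimal sketch of a public register entry for an automated decision system.
# The field names and example values are illustrative, not an official schema
# or any specific city's register.
from dataclasses import dataclass, asdict
from typing import List, Optional
import json

@dataclass
class AlgorithmRegisterEntry:
    name: str
    purpose: str                              # primary purpose of the project
    owner_department: str
    data_sources: List[str]
    potentially_affected_groups: List[str]    # who could be negatively affected
    source_code_url: Optional[str] = None     # link to the scrutable open-source code
    last_audit_date: Optional[str] = None

entry = AlgorithmRegisterEntry(
    name="Restaurant inspection prioritization (hypothetical example)",
    purpose="Rank inspections so critical violations are found sooner",
    owner_department="Public Health",
    data_sources=["past inspections", "311 complaints", "weather"],
    potentially_affected_groups=["small restaurant owners", "frequently inspected neighborhoods"],
)

# Publishing the register as JSON lets residents and auditors scrutinize it.
print(json.dumps(asdict(entry), indent=2))
```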

5. Summary: Data analyst, go out there and listen

In the summer of 2015, cooling towers were killing people in New York City. Legionnaires’ disease is a form of bacterial pneumonia spread through the inhalation of infected water vapors, and several sites in the South Bronx saw enough cases in July and August that it became the biggest outbreak of the disease in the city’s history. Public health officials mandated the inspection and disinfection of all cooling towers in the area, but there was no official record of where such towers were located.

MODA was called to help.[25] There had been a dozen fatalities and the city was scrambling to identify and begin tracking all the existing towers. Many departments tried to get as much information as possible in a variety of ways, from creating a self-registration portal to dialing building managers to physically inspecting buildings, resulting in disparate data sets with no “ground truth.”

By quickly pulling together all the information gathered, MODA was able to create a single, reliable data set for the inspections to work from as well as predictive models to find the missing towers. An early machine-learning model had enough predictive power to find approximately 90 percent of all cooling towers in New York. This allowed the city to move quickly, stopping the spread of the disease in a matter of weeks.
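
The report does not detail how MODA’s model worked, but a hedged sketch of the general approach, training a classifier on buildings already known to have or lack a cooling tower and then scoring the rest of the building stock, might look like the following; the features and data are invented for illustration.

```python
# Illustrative sketch only: a classifier that scores buildings by their
# likelihood of having a cooling tower. Feature names are hypothetical and
# the data below is fabricated for demonstration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features: [floor area (sq ft), stories, year built, is_commercial]
X_known = np.array([
    [250_000, 40, 1971, 1],
    [180_000, 25, 1965, 1],
    [  2_400,  2, 1920, 0],
    [  3_100,  3, 1955, 0],
    [420_000, 52, 1988, 1],
    [  1_800,  2, 2005, 0],
])
y_known = np.array([1, 1, 0, 0, 1, 0])  # 1 = has a registered cooling tower

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_known, y_known)

# Score buildings with no registration record so inspectors can prioritize visits.
X_unknown = np.array([
    [300_000, 35, 1975, 1],
    [  2_000,  2, 1990, 0],
])
probabilities = model.predict_proba(X_unknown)[:, 1]
for building, p in zip(X_unknown, probabilities):
    print(f"stories={building[1]:>3}  P(cooling tower) ~= {p:.2f}")
```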

Of course, not every project is so dramatic. Some projects may seem more mundane. Some may never be implemented. But they are all important. As MODA’s support in the Legionnaires’ disease crisis illustrates, adding analytical capacity matters more and more as cities adapt to the growing complexity of the urban challenges they face and work to provide value to citizens. The ability to find previously unseen value in your data sets, tie city operations to increases in performance, prove the value of new policies with data, and understand the limitations and pitfalls of these approaches so that they are used responsibly will inevitably go from being seen by city leadership as a nice-to-have to a must-have.

6. Annex

6.1. Case study: Chicago’s CDO has a centralized mandate[26]

Tom Schenk Jr. became the second Chief Data Officer (CDO) to serve in Chicago, after having played roles in the private, public, and academic sectors and publishing a book on data visualization. His arrival came just after the unit had been moved from the mayor’s office to the Department of Innovation and Technology (DoIT), a department with highly qualified staff but a lot of attrition, which had operational responsibility for maintaining and upgrading every single database and major digital platform used by other departments. The appointment gave Schenk dual roles as both CDO and deputy director of IT.

Dual roles that converge

These dual responsibilities provided a tactical advantage, because he had to understand the ins and outs of the systems serving the rest of City Hall. Tom and his team optimized those systems while also molding them to be more responsive to the needs of future analytics teams. This reduced the barriers between analysts and IT professionals, and whenever a database was hard to access, Tom could easily get his team to grant access and solve the issue.

Being able to inventory the data was essential for setting up future successes. However, Schenk warns that quick wins are an absolute necessity. “Stuff will get hard and you will need to ask a lot of people [for help],” he says. “An inventory of data will take a lot of time and effort, and both residents and internal stakeholders will not perceive its value until you show some progress and real, tangible results.”

As CDO, Schenk undertook a broad variety of projects, from predictive analytics to open data to automation. One of his most notable projects was a predictive algorithm that identified where kids were most likely to suffer from lead poisoning before they were 1 year old.

Selecting projects as a centralized analytics unit

For a centralized data analytics unit, choosing which projects to spend time on was a constant question. The team worked closely with the University of Chicago: in conjunction with its Master of Science in Computational Analysis & Public Policy program, it created a framework to screen projects and assess the data maturity required for meaningful analysis.[27] Although the criteria were very explicit, the project intake remained flexible as the team evaluated the availability of data, the potential impact, and the ability of the partnering unit to operationalize the problem, that is, what it would do after the analysis was conducted.

Effective projects through collaboration

Another publicized initiative was a predictive model for food inspection evaluations.[28] This model, which aimed to optimize the procedure for inspecting restaurants and resulted in critical violations being found seven days earlier, grew out of a grant awarded to Chicago by Bloomberg Philanthropies as part of the Mayors Challenge.[29] It later became a collaborative partnership between DoIT, the Chicago Department of Public Health (CDPH), the Civic Consulting Alliance, and Allstate Insurance.

Schenk also spearheaded the creation of a robust open data policy and hosted hackathons to promote the use of city data and get the community engaged.

An example of the difficulties of collaboration came from one project that intended to predict where illegal cigarette sales occurred. Because of the nature of the issue, the relevant data were collected by both the county and the city, and getting different levels of government (which have different data collection and standardization practices) on board required significant coordination. Because the project did not reach concrete results early on, it eventually got sidetracked and was discarded.

6.2. Case study: Piloting London’s Office of Data Analytics[30]

In 2016 a pilot began for the London Office of Data Analytics (LODA). It was orchestrated by the Greater London Authority; 12 of London’s 32 boroughs; Nesta, an innovation NGO; and ASI, a data science firm.

Although it was modeled after New York City’s Mayor’s Office of Data Analytics, LODA faced an additional layer of complexity because each of London’s boroughs had its own council, government structure, and data sets without any obvious standardization.

LODA found its first project by asking for suggestions from the participating boroughs. Then a workshop session was conducted to collaboratively assess the candidate projects based on the criteria below (a simple weighted-scoring sketch follows the list):

  • Money-saving potential
  • Availability of data
  • The ability to produce insights and deliver results within two months
  • The ability to solve the problem without personal data sets
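
This report does not describe a formal scoring model, but the criteria above could be combined into a simple weighted score along the lines of the sketch below; the weights and candidate scores are invented for illustration.

```python
# Illustrative sketch of weighted project screening. The criteria mirror the
# list above, but the weights and candidate scores are invented.
criteria_weights = {
    "money_saving_potential": 0.3,
    "data_availability": 0.3,
    "insights_within_two_months": 0.2,
    "works_without_personal_data": 0.2,
}

candidate_projects = {
    "unlicensed multiple occupation": {"money_saving_potential": 4, "data_availability": 3,
                                       "insights_within_two_months": 4, "works_without_personal_data": 3},
    "illegal waste dumping":          {"money_saving_potential": 3, "data_availability": 2,
                                       "insights_within_two_months": 3, "works_without_personal_data": 5},
}

def weighted_score(scores: dict) -> float:
    """Combine 1-5 criterion scores into a single weighted value."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(candidate_projects.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```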

Data to tackle common challenges

After a multistage project selection process, the pilot focused on one use case: leveraging predictive analytics to identify houses in multiple occupation that did not have the appropriate licenses. By combining data sets from multiple sources, the team sought to point inspectors toward likely violators.

One of the main questions the pilot sought to answer was whether this methodology could easily scale to all boroughs of the city or to other policy areas. Ultimately, this would create the opportunity to intelligently design shared services and coordinate the actions of different teams.

Data scientists from an external consultancy were brought on board, as the data analytical capacity inside the boroughs was extremely limited and their analysts could not commit full time to this project. Internal and external analysts worked together to identify relevant data sets covering building inspections, noise complaints, tax bands, and more.

Learning from failure

The LODA pilot failed to produce meaningful results or a scalable methodology.

One of the main barriers was that the boroughs collected widely varying data, and when they did collect the same data, they used different formats. This forced the team to create individual models for each borough. Much time had to be spent processing, cleaning, and merging the data. Moreover, the data sets weren’t properly geomatched; that is, there wasn’t a unique identifier for each property that allowed the merging of data.
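
A hedged sketch of the kind of data wrangling this implies appears below: without a shared property identifier, analysts often fall back on normalizing address strings into a crude join key across data sets. The column names and normalization rules are illustrative, not LODA’s actual pipeline.

```python
# Illustrative sketch: building a crude join key from address strings when
# data sets share no unique property identifier. Column names are hypothetical.
import re
import pandas as pd

def address_key(address: str, postcode: str) -> str:
    """Normalize an address plus postcode into a rough matching key."""
    cleaned = re.sub(r"[^a-z0-9 ]", "", address.lower())
    cleaned = re.sub(r"\s+", " ", cleaned).strip()
    return f"{cleaned}|{postcode.replace(' ', '').upper()}"

inspections = pd.DataFrame({
    "address": ["12 High St.", "7 Station Road"],
    "postcode": ["E1 6AN", "N1 9GU"],
    "inspection_result": ["pass", "fail"],
})
complaints = pd.DataFrame({
    "addr": ["12 HIGH ST", "98 Mill Lane"],
    "post_code": ["e1 6an", "SE5 8TR"],
    "noise_complaints": [3, 1],
})

inspections["key"] = [address_key(a, p) for a, p in zip(inspections.address, inspections.postcode)]
complaints["key"] = [address_key(a, p) for a, p in zip(complaints.addr, complaints.post_code)]

# Left-join on the crude key; unmatched rows show where geomatching breaks down.
merged = inspections.merge(complaints[["key", "noise_complaints"]], on="key", how="left")
print(merged)
```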

From a data science standpoint, it was hard to build a predictive model because the variable of interest was only partially labelled: the team knew for certain that some houses fell into the “unlicensed multiple occupation” category, but could not be certain that any given house was not in that category. This made assessing the accuracy of the model difficult.

Other challenges cited by the pilot group included:

  • Data quality: significant effort was devoted to cleaning; inability to geomatch certain data sets
  • Data availability: private rental data were missing or hard to access
  • Data warehousing: some boroughs did not have centralized business intelligence units or data warehouses and the data had to be pulled individually from different sections of the organization
  • Rarity of the predicted variable: The variable of interest was too rare in certain boroughs, which made the construction of a predictive model hard
  • Lack of capacity: Lack of available in-house expertise in the boroughs to work in both the data analytical portion and the implementation that should have followed

After the pilot, an information sharing protocol was signed by 12 of the boroughs and Mayor Sadiq Khan announced the creation of a City Data Analytics Programme to build on the connections made possible by the data partnership between boroughs.

6.3. Case study: Boston’s implementation of a centralized data warehouse[31]

The City of Boston has implemented its own centralized data warehouse, which serves as a repository for different databases that can be used throughout several departments. It was built over a period of three years with a contractor hired through a competitive bidding process. The project started small, encompassing a handful of databases, automating the loading of data into the system as much as possible, and expanding from there. Today, more than 30 departments have their data up and running.

Investing in data infrastructure

Why is Boston investing in its data infrastructure? On one hand, the city avoids conflicting data reports by establishing a single repository of reliable information, which creates trust. Before the centralized warehouse, several versions of the same data might have existed in different departments. For example, if the 311 phone line (which in many cities is the main aggregator for citizen requests and complaints) gets reports about potholes in the streets, it may keep a separate list and then pass it on to the Department of Transportation, which is in charge of filling the potholes. The 311 division and DOT may keep separate data sets, perhaps with different attributes. (Has the complaint been inspected? Has the citizen been contacted when their case is closed?) If you wanted to understand the entirety of pothole operational performance, you would have to track down several different data sets held by different people in different departments, perhaps even in different formats.
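
A hedged sketch of what that single view enables is shown below: once 311 intake and work-order records sit in one warehouse, an end-to-end measure such as days-to-fill becomes a single join. The table and column names are invented for illustration and are not Boston’s actual schema.

```python
# Illustrative sketch: once 311 intake and DOT work orders live in one
# warehouse, end-to-end performance questions become a single join.
# Table and column names are hypothetical.
import pandas as pd

requests_311 = pd.DataFrame({
    "case_id": [101, 102, 103],
    "opened": pd.to_datetime(["2020-03-01", "2020-03-02", "2020-03-05"]),
    "citizen_contacted_on_close": [True, False, True],
})
dot_work_orders = pd.DataFrame({
    "case_id": [101, 102],
    "inspected": [True, True],
    "filled": pd.to_datetime(["2020-03-04", "2020-03-10"]),
})

potholes = requests_311.merge(dot_work_orders, on="case_id", how="left")
potholes["days_to_fill"] = (potholes["filled"] - potholes["opened"]).dt.days

print(potholes[["case_id", "days_to_fill", "citizen_contacted_on_close"]])
print("Median days to fill:", potholes["days_to_fill"].median())
```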

The pothole example helps clarify a second advantage of a centralized warehouse: as you move toward advanced analytics (building dashboards, running predictive models, and so on), a centralized model will save your analysts considerable time and effort, and their results will be more reliable. Because they don’t have to spend time tracking down and cleaning the data, steps that could have taken weeks or months turn into days, and the analytics team can spend its time on activities that really add value, such as working with partners to understand the operational implications behind the data or ensuring the robustness of the analysis.

Boston’s warehouse has been instrumental in implementing one of the city’s priorities: its Vision Zero plan for eliminating fatal and serious traffic crashes.[32] This plan, which involves several departments (transportation, public works, police, and more), is built upon reporting, dashboards, and geospatial visualizations that rely on data collected by several stakeholders. By leveraging the data warehouse, the analytics team at the Department of Innovation and Technology was able to quickly and frequently provide information regarding traffic patterns, street interventions, reported incidents, and more.

Maria Borisova, one of the city’s software engineers, who has been overseeing the warehouse implementation from its inception, tells us that the technical details are relevant but not that hard to figure out. It is often working with other partners that requires thoughtfulness. Often, Borisova’s counterparts at other city departments have concerns about centralizing their data, worrying about the accessibility, reliability, and safety of the new system. Patience, starting small and building iteratively, and showing value to your counterparts are vital, she says.

6.4. Case study: Transport for London and leveraging data products[33]

As the chief data officer of Transport for London (TfL), Lauren Sager Weinstein heads a team of 70 people, including data scientists, product managers, software developers, and data architects. The team is relatively new in the organization, and part of its mission is to centralize the creation of tools that use the vast amount of data generated by London’s transportation network, from traffic information to traffic-signal data to customer data, while preventing the creep of siloed data tools that do not interact with each other.

Creating real business value through analytics

The data science team at TfL seeks not only to understand data but also to create meaningful products that add business value to the organization. This requires an understanding of TfL’s strategic priorities,[34] such as expanding bus service to outer London and reducing carbon emissions, as well as of the operational complexity of the network. While keeping in constant communication with other divisions of TfL, a data scientist may find an insight that could be used to improve a process. After a proof of concept to test its usefulness, a team of developers may build a software tool around it and a dashboard to measure its outcomes. One of the several product managers may be in charge of ensuring that the tool continues to fulfill its goal and hit its targets.

To accomplish this, the team works with an agile methodology, building minimum viable products with defined outcome metrics. It is essential that the partners for a given project clearly articulate what analysis they need and what it will be used for; without a connection to the actual operations that will be affected, there is no real business value. For example, to execute the city’s Vision Zero,[35] which seeks to eradicate road deaths by 2030, the transit division wanted to understand where bus speeding was most frequent in order to take the right preventive measures.

Another of the team’s recent undertakings has been a pilot to test whether WiFi connection data (collected at tube stations) could improve the organization’s understanding of traffic patterns and help create measures to avoid congestion, improve operations, and prioritize investments. With data coming from over 500 million connection requests, the team prototyped a series of solutions, including a display of approximate train congestion for passengers waiting in stations. By using an agile approach, TfL has been able to test both technical feasibility and business value, and can now begin to consider moving some of the ideas that were tested into production.
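
A hedged sketch of the core aggregation step, turning raw connection requests into an approximate crowding signal per station and time window, could look like the following; the field names, sample data, and threshold are illustrative and do not reflect TfL’s actual pipeline.

```python
# Illustrative sketch: approximating station crowding from WiFi connection
# requests. Field names, data, and thresholds are hypothetical.
import pandas as pd

connections = pd.DataFrame({
    "station": ["Oxford Circus", "Oxford Circus", "Oxford Circus", "Vauxhall", "Vauxhall"],
    "device_id": ["a1", "b2", "a1", "c3", "d4"],   # assumed hashed/pseudonymized upstream
    "timestamp": pd.to_datetime([
        "2019-07-01 08:01", "2019-07-01 08:04", "2019-07-01 08:12",
        "2019-07-01 08:03", "2019-07-01 08:40",
    ]),
})

# Count distinct devices per station per 15-minute window as a crowding proxy.
connections["window"] = connections["timestamp"].dt.floor("15min")
crowding = (
    connections.groupby(["station", "window"])["device_id"]
    .nunique()
    .rename("distinct_devices")
    .reset_index()
)

# A simple threshold could drive a "station busy" message for waiting passengers.
crowding["busy"] = crowding["distinct_devices"] >= 2
print(crowding)
```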

It is also worth noting TfL’s commitment to transparency (the results of pilots are published online[36]) and its clear policies regarding privacy and the use of personal information.[37]

 

[1] Jane Wiseman, Lessons from Leading CDOs: A Framework for Better Civic Analytics (2017).

[3] Susan Cunningham, Mark McMillan, Sara O’Rourke, and Eric Schweikert, “Cracking down on government fraud with data analytics,” McKinsey & Company (October 2018).

[5] Interview with Tom Schenk (October 2019).

[15] Interview with Maria Borisova, Data Engineering Manager at the City of Boston (February 2020).

[17] Interview with Lauren Sager Weinstein, Chief Data Officer at Transport for London (March 2020).

[26] Interview with Tom Schenk (October 2019).

[31] Interview with Maria Borisova, Data Engineering Manager at the City of Boston (February 2020).

[33] Interview with Lauren Sager Weinstein, Chief Data Officer at Transport for London (March 2020).

Last updated on 10/06/2020

Federal COVID‐19 Response Funding for Tribal Governments: Lessons from the CARES Act

Citation:

Henson, Eric, Miriam R. Jorgensen, Joseph Kalt, and Megan Hill. 2020. “Federal COVID‐19 Response Funding for Tribal Governments: Lessons from the CARES Act”.

Abstract:

Eric C. Henson, Megan M. Hill, Miriam R. Jorgensen & Joseph P. Kalt; July 2020 

The federal response to the COVID‐19 pandemic has played out in varied ways over the past several months.  For Native nations, the CARES Act (i.e., the Coronavirus Aid, Relief, and Economic Security Act) has been the most prominent component of this response to date. Title V of the Act earmarked $8 billion for tribes and was allocated in two rounds, with many disbursements taking place in May and June of this year.

This federal response has been critical for many tribes because of the lower socio‐economic starting points for their community members as compared to non‐Indians. Even before the pandemic, the average income of a reservation‐resident Native American household was barely half that of the average U.S. household. Low average incomes, chronically high unemployment rates, and dilapidated or non‐existent infrastructure are persistent challenges for tribal communities and tribal leaders. Layering extremely high coronavirus incidence rates (and the effective closure of many tribal nations’ entire economies) on top of these already challenging circumstances presented tribal governments with a host of new concerns. In other words, at the same time tribal governments’ primary resources were decimated (i.e., the earnings of tribal governmental gaming and non‐gaming enterprises dried up), the demands on tribes increased. They needed these resources to fight the pandemic and to continue to meet the needs of tribal citizens.

Read the full report

Last updated on 07/24/2020

Emerging Stronger than Before: Guidelines for the Federal Role in American Indian and Alaska Native Tribes’ Recovery from the COVID‐19 Pandemic

Abstract:

Eric C. Henson, Megan M. Hill, Miriam R. Jorgensen & Joseph P. Kalt; July 2020 

In this policy brief, we offer guidelines for federal policy reform that can fulfill the United States’ trust responsibility to tribes, adhere to the deepest principles of self‐governance upon which the country is founded, respect and build the governing capacities of tribes, and in the process, enable tribal nations to emerge from this pandemic stronger than they were before. We believe that the most‐needed federal actions are an expansion of tribal control over tribal affairs and territories and increased funding for key investments in tribal communities. 

Read the full report

Last updated on 07/24/2020

Lift Every Voice: The Urgency of Universal Civic Duty Voting

Citation:

Rapoport, Miles, E.J. Dionne, et al. 2020. “Lift Every Voice: The Urgency of Universal Civic Duty Voting.” Ash Center for Democratic Governance and Innovation, Brookings Governance Program.

Abstract:

The Universal Voting Working Group, July 2020 

Imagine an American democracy remade by its citizens in the very image of its promise, a society where the election system is designed to allow citizens to perform their most basic civic duty with ease. Imagine that all could vote without obstruction or suppression. Imagine Americans who now solemnly accept their responsibilities to sit on juries and to defend our country in a time of war taking their obligations to the work of self-government just as seriously. Imagine elections in which 80 percent or more of our people cast their ballots—broad participation in our great democratic undertaking by citizens of every race, heritage and class, by those with strongly-held ideological beliefs, and those with more moderate or less settled views. And imagine how all of this could instill confidence in our capacity for common action.

This report is offered with these aspirations in mind and is rooted in the history of American movements to expand voting rights. Our purpose is to propose universal civic duty voting as an indispensable and transformative step toward full electoral participation. Our nation’s current crisis of governance has focused unprecedented public attention on intolerable inequities and demands that Americans think boldly and consider reforms that until now seemed beyond our reach.

Read the full report

Last updated on 07/22/2020

Our Path to “New Normal” in Employment? Sobering Clues from China and Recovery Scores for U.S. Industry

Citation:

Cunningham, Edward, and Phillip Jordan. 2020. “Our Path to “New Normal” in Employment? Sobering Clues from China and Recovery Scores for U.S. Industry.” Ash Center for Democratic Governance and Innovation.

Abstract:

Edward Cunningham and Philip Jordan, July 2020 

The US National jobs reports for May and June exceeded expectations, and for many, this signaled that April was the true peak of American job losses and real recovery may be underway. Yet mounting evidence suggests that a job recovery is a long way off and that many jobs may not return.

Part of the analytic disconnect stems from the fact that the global pandemic is a novel challenge for policymakers and analysts. We lack current, useful benchmarks for estimating the damage to the labor market, for estimating what recovery would look like, and for measuring an eventual recovery in jobs. Given this paucity of models, one place to look for patterns of potential recovery – particularly relating to consumption and mobility – is China.

The Chinese economy is driven largely by consumption, urban job creation is driven by small and medium-sized companies, and China is several months ahead of the US in dealing with the pandemic’s economic and labor impact. An analysis of China’s experience may, therefore, offer important clues about our recovery here at home, and inform new models of thinking about American job recovery.

Read the full report

Last updated on 07/15/2020

Understanding CCP Resilience: Surveying Chinese Public Opinion Through Time

Citation:

Cunningham, Edward, Tony Saich, and Jessie Turiel. 2020. Understanding CCP Resilience: Surveying Chinese Public Opinion Through Time. Ash Center for Democratic Governance and Innovation.

Abstract:

Edward Cunningham, Tony Saich, and Jessie Turiel, July 2020

This policy brief reviews the findings of the longest-running independent effort to track Chinese citizens’ satisfaction with government performance. China today is the world’s second largest economy and the Chinese Communist Party (CCP) has ruled for some seventy years. Yet long-term, publicly available, and nationally representative surveys in mainland China are so rare that it is difficult to know how ordinary Chinese citizens feel about their government.

We find that first, since the start of the survey in 2003, Chinese citizen satisfaction with government has increased virtually across the board. From the impact of broad national policies to the conduct of local town officials, Chinese citizens rate the government as more capable and effective than ever before. Interestingly, more marginalized groups in poorer, inland regions are actually comparatively more likely to report increases in satisfaction. Second, the attitudes of Chinese citizens appear to respond (both positively and negatively) to real changes in their material well-being, which suggests that support could be undermined by the twin challenges of declining economic growth and a deteriorating natural environment.

While the CCP is seemingly under no imminent threat of popular upheaval, it cannot take the support of its people for granted. Although state censorship and propaganda are widespread, our survey reveals that citizen perceptions of governmental performance respond most to real, measurable changes in individuals’ material well-being. For government leaders, this is a double-edged sword, as citizens who have grown accustomed to increases in living standards will expect such improvements to continue, and citizens who praise government officials for effective policies may indeed blame them when such policy failures affect them or their family members directly. While our survey reinforces narratives of CCP resilience, our data also point to specific areas in which citizen satisfaction could decline in today’s era of slowing economic growth and continued environmental degradation.

Read the full report

Last updated on 07/08/2020

Policy Prototyping for the Future of Work

Citation:

Gustetic, Jenn, Carlos Teixeira, Becca Carroll, Joanne Cheung, Susan O’Malley, and Megan Brewster. 2020. “Policy Prototyping for the Future of Work”.

Abstract:

Jenn Gustetic, Carlos Teixeira, Becca Carroll, Joanne Cheung, Susan O'Malley, and Megan Brewster; June 2020

The future of work will require massive re-skilling of the American workforce for which current policy “toolboxes” for economics, labor, technology, workforce development and education are often siloed and antiquated. To meet the needs of tomorrow’s workers, today’s policy makers must grapple with these interdisciplinary policy issues.

This report describes a novel design-driven approach we developed to create policy “prototype” solutions that are inherently interdisciplinary, human-centered, and inclusive for the future of work. Using our design-driven approach, we collaborated with more than 40 interdisciplinary and cross-sector thinkers and doers to generate 8 distinct policy prototypes to support the future of work.

Read the full report

Fiscal Strategies to Help Cities Recover—And Prosper

Citation:

Goldsmith, Stephen, and Charles “Skip” Stitt. 2020. “Fiscal Strategies to Help Cities Recover—And Prosper.” Ash Center for Democratic Governance and Innovation.

Abstract:

Stephen Goldsmith, May 2020 

Despite robust economies, many local officials entered 2020 already worried about budget balances that looked fragile in the short term and problematic in the long term due to enormous pension and health-care issues. Today, in the wake of COVID-19, clearly federal support is necessary, but it is also apparent that it cannot alleviate all the pressures on communities as responsibilities related to the pandemic skyrocket while revenues plummet.

While many public managers will rightly deploy a host of tactical cost-cutting measures, the most creative among them will explore deeper and more strategic changes, such as those presented herein, which will help address the current crisis while preparing their cities for the future. This paper suggests a transition to a culture deeply focused on data, incentives for city workers to produce internal reforms, public-private partnerships that monetize operational excellence, and rapid adoption of both new technologies and good ideas borrowed from other jurisdictions. These more deliberate and strategic approaches may be harder to implement but those offered here need not harm incumbent public employees nor negatively impact cities’ efforts to ensure access and equity. Rather, the strategies we outline should strengthen the efficiency and mandates of existing government offices while helping make cities more resilient and better prepared for tomorrow’s challenges.

Read the full report

Last updated on 06/23/2020

Union Impact on Voter Participation—And How to Expand It


Abstract:

Tova Wang, May 2020

Some politicians have enacted measures in recent years to make voting harder and to reduce participation among certain groups. Others have sought to counteract that voter suppression by implementing laws to make voting easier, such as same-day or automatic registration. There is another antidote to the effort to reduce participation: lifting up worker organizations. This is especially important to understand given the ways in which powerful individuals and groups have sought to weaken unions because of their political strength representing American workers.

In this report, the author first explains efforts to weaken unions and the voice of working people; then what the decline of unions and union membership has meant for participation; next, Wang looks at the data showing the positive effects unions have on voter participation; and finally, she suggests how going forward we can reform the laws and how labor is structured such that it not only continues to facilitate voter participation, but even enhances it.

Read the full report

Last updated on 06/30/2020

2019 State of Digital Transformation

Citation:

Eaves, David, and Georges Clement. 2020. “2019 State of Digital Transformation”.

Abstract:

David Eaves, Georges Clement; May, 2020

In June of 2019, the Harvard Kennedy School hosted digital service teams from around the world for our annual State of Digital Transformation convening. Over two days, practitioners and academics shared stories of success, discussed challenges, and debated strategy around the opportunities and risks digital technologies present to governments.

Teams that joined us for the summit used different approaches and methodologies in vastly different contexts. Some governments—such as those of Estonia and Bangladesh—were building on a decade or more of experience refining already-advanced practices; others—such as the state of Colorado’s—were still getting ready to formally launch. Some had deep connections across their entire executive branch; others were tightly focused within a single agency.

Despite these differences, many key themes emerged throughout the convening. This paper contains reflections from the Summit. 

Read the full report

Dissecting the US Treasury Department’s Round 1 Allocations of CARES Act COVID‐19 Relief Funding for Tribal Governments

Abstract:

Randall K.Q. Akee, Eric C. Henson, Miriam R. Jorgensen, and Joseph P. Kalt; May 2020 

This study dissects the US Department of the Treasury’s formula for distributing first-round CARES Act funds to Indian Country. The Department has indicated that its formula is intended to allocate relief funds based on tribes’ populations, but the research team behind this report finds that Treasury has employed a population data series that produces arbitrary and capricious “over-” and “under-representations” of tribes’ enrolled citizens.

Read the full report

Last updated on 05/18/2020

Crisis Communications for COVID-19

Citation:

Leonard, Herman B. "Dutch", Arnold M. Howitt, and David Giles. 2020. “Crisis Communications for COVID-19”.

Abstract:

Herman "Dutch" Leonard, Arnold Howitt, and David Giles; April 2020

Communication with employees, customers, investors, constituents, and other stakeholders can contribute decisively to the successful navigation of a crisis.  But how should leaders think about what they are trying to say – and how to say it?

This policy brief lays out simple frameworks that can be used to formulate the messages that leaders can and should – indeed, must – convey to help their communities and organizations make their way forward as effectively as they reasonably can.

Read the full report

Last updated on 04/28/2020

Crisis Management for Leaders Coping with COVID-19

Citation:

Leonard, Herman B. "Dutch", Arnold M. Howitt, and David W. Giles. 2020. “Crisis Management for Leaders Coping with COVID-19”.

Abstract:

Herman "Dutch" Leonard, Arnold Howitt, and David Giles; April 2020

In the face of the rapidly evolving coronavirus crisis that demands many urgent decisions but provides few clear-cut cues and requires tradeoffs among many critically important values, how can leaders and their advisers make effective decisions about literally life-and-death matters?  This policy brief contrasts the current “crisis” environment with the more familiar realm of “routine emergencies.” It argues that for crises, leaders need to adopt a more agile, highly adaptive, yet deliberate decision-making method that can move expeditiously to action, while retaining the capacity to iteratively re-examine tactics in light of decision impacts. This method can help the team take account of the multiple dimensions of the COVID-19 crisis and cope as well as possible with swiftly changing conditions.

Read the full report

Last updated on 10/19/2020

China's Most Generous: Examining Trends in Contemporary Chinese Philanthropy


Abstract:

Edward Cunningham and Yunxin Li, March 2020 

This annual report highlights leading results from the most recent data analysis of the Harvard Kennedy School Ash Center’s China Philanthropy Project, capturing over one-quarter of estimated national giving in China. We focus on elite giving by building an annual database of the top 100 individual donors, top 100 donors from corporations and other organizations, and also top university recipients of philanthropic giving.

In 2018, such Chinese giving:

  • was dominated by large organizations (most commonly corporations) rather than individuals,  
  • supported in large part central government policy priorities in the area of poverty alleviation,
  • revealed an intriguing new philanthropy-driven educational model in the country, and
  • remained fairly local in scope.

Read the report in Chinese 

Read the full English report

Last updated on 05/04/2020

China’s Role in Promoting Transboundary Resource Management in the Greater Mekong Basin (GMB)

Abstract:

Malcolm McPherson, March 2020 

This paper examines how China can improve transboundary resource management within the Greater Mekong Basin (GMB) through its participation in the Lancang-Mekong Cooperation (LMC). Such improvement would ensure the efficient management and equitable development of the basin’s natural resources and ecosystems.

Read the full report

Prioritizing Public Value in the Changing Mobility Landscape

Citation:

Goldsmith, Stephen, and Betsy Gardner. 2020. “Prioritizing Public Value in the Changing Mobility Landscape”.

Abstract:

Stephen Goldsmith and Betsy Gardner, January 2020

In this paper we will look at the values and goals cities affect with policies concerning connected mobility, and how to create a new framework that aligns with these objectives. First, we identify the transformative changes affecting cities and mobility. Second, we discuss in more detail the guiding values and goals that cities have around mobility with examples of these values in practice. Our next paper, Effectively Managing Connected Mobility Marketplaces, discusses the different regulatory approaches that cities can leverage to achieve these goals.

We recommend that cities identify various public values, such as Equity or Sustainability, and use these to shape their transit policy. Rather than segmenting the rapidly changing mobility space, cities should take advantage of the interconnectivity of issues like curb space management, air quality, and e-commerce delivery to guide public policy. Cities must establish a new system to meet the challenges and opportunities of this new landscape, one that is centered around common values, prioritizes resident needs, and is informed by community engagement.

In conclusion, cities must use specific public values lenses when planning and evaluating all the different facets of mobility. Transportation has entered a new phase, and we believe that cities should move forward with values- and community-driven policies that frame changing mobility as an opportunity to amend and improve previous transportation policies.

This paper is the first in the Mobility in the Connected City series.

Read the second paper "Effectively Managing Connected Mobility Marketplaces" 

Read the full report

Last updated on 02/05/2020

Effectively Managing Connected Mobility Marketplaces

Citation:

Goldsmith, Stephen, and Mathew Leger. 2020. “Effectively Managing Connected Mobility Marketplaces”.

Abstract:

Stephen Goldsmith and Matt Leger, February 2020

As new innovations in mobility have entered the marketplace, local government leaders have struggled to adapt their regulatory framework to adequately address new challenges or the needs of the consumers of these new services. The good news is that the technology driving this rapid change also provides the means for regulating it: real-time data. It is the responsibility of cities to establish rules and incentives that ensure proper behavior on the part of mobility providers while steering service delivery towards creating better public outcomes. Cities must use the levers at their disposal to ensure an equitable mobility marketplace and utilize real-time data sharing to enforce compliance. These include investing in and leveraging physical and digital infrastructure, regulating and licensing business conducted in public space, establishing and enforcing rules around public safety, rethinking zoning and land use planning to be transit-oriented, and regulating the digital realm to protect data integrity.

This paper is the second in the Mobility in the Connected City series. 

Read the first paper  "Prioritizing Public Value in the Changing Mobility Landscape"

Read the full report

Last updated on 02/05/2020

Science, Technology, & Democracy: Building a Modern Congressional Technology Assessment Office


Abstract:

Zach Graves and Daniel Schuman, January 2020

This paper offers recommendations and a road map for the future success of a restarted technology assessment office in Congress. We look at three potential approaches: (1) Building up the Government Accountability Office (GAO)’s OTA-like capacity in its newly created Science, Technology Assessment, and Analytics (STAA) team, and giving it greater resources and structural autonomy; (2) Reviving OTA but updating its procedures and statutory authority; and (3) A hybrid approach wherein both GAO and a new OTA develop different capacities and specializations. (Spoiler: we favor the third approach.)
 
The next section of this paper reviews what OTA was and how it functioned. The third section discusses the history of and rationale for the defunding of OTA, other cuts to Congress’s S&T capacity, and why this congressional capacity and expertise matter for democracy. The fourth section reviews efforts to revive OTA and other efforts to build new congressional S&T capacity. The fifth section discusses the political landscape for building S&T capacity, including the legislative branch appropriations process and the different political constituencies for S&T. The final section offers a detailed discussion of various structural recommendations for a new congressional technology assessment office, including an expanded STAA unit in GAO, and a new OTA.
 

Read the full report

Last updated on 01/27/2020

Hong Kong: The Rise and Fall of “One Country, Two Systems”

Abstract:

William H. Overholt, December 2019

This is an extensively edited, updated and expanded text of a lecture given for the Mossavar-Rahmani Center for Business and Government at Harvard Kennedy School on October 31, 2019. From the origination of “one country, two systems” in 1979 to today, this paper analyzes the history of the unique relationship between Hong Kong, Beijing, and the world.

Read full paper

Last updated on 03/31/2020

Playbook: Government as Platform

Abstract:

Richard Pope, November 2019

Looking around the world, we can see a different approach to digital government. One of cross-government platforms that are beginning to break down organizational silos, save money and change the types of services that can be delivered to the public. This playbook is written for practitioners, from public sector product managers to chief digital officers, looking for approaches to implementing platforms in government. 

Read full paper

Last updated on 01/24/2020

The Arizona Independent Redistricting Commission: One State's Model for Reform

Citation:

Mathis, Colleen, Daniel Moskowitz, and Benjamin Schneer. 2019. “The Arizona Independent Redistricting Commission: One State's Model for Reform”.

Abstract:

Colleen Mathis, Daniel Moskowitz, and Benjamin Schneer; September 2019 

In most states, redistricting, the process by which electoral district boundaries are drawn, is an overtly partisan exercise controlled by state legislatures. The U.S. Supreme Court’s 2019 decision Rucho v. Common Cause held that federal courts cannot review allegations of partisan gerrymandering. Independent redistricting in practice has proven remarkably successful along several dimensions. This policy brief outlines key lessons learned from redistricting in Arizona, a state with a five-person independent redistricting commission.

Read full paper

Last updated on 01/24/2020

Civic Responsibility: The Power of Companies to Increase Voter Turnout


Abstract:

Sofia Gross and Ashley Spillane, June 2019 

This case study provides an analysis and evaluation of the implementation of civic participation programs by companies aimed at increasing voter turnout. The United States consistently lags behind the majority of developed democratic nations in voter turnout, averaging less than half of the eligible voter population participating in midterm elections. The U.S. ranks 26th out of 32 developed democracies in percentage of eligible voters who participate in elections. Today, many companies have dedicated resources for corporate social responsibility projects aimed at strengthening society and building goodwill among employees, consumers, and the public. Voter participation initiatives align with the goals of social responsibility projects, as they address a critical societal problem (lack of engagement), while building goodwill with key stakeholders. 

Read full paper

Last updated on 10/13/2020

Can Transparency and Accountability Programs Improve Health? Experimental Evidence from Indonesia and Tanzania

Abstract:

Transparency for Development Team, June 2019 

This paper assesses the impact of a transparency and accountability program designed to improve maternal and newborn health (MNH) outcomes in Indonesia and Tanzania. Co-designed with local partner organizations to be community-led and non-prescriptive, the program sought to encourage community participation to address local barriers in access to high-quality care for pregnant women and infants. This paper evaluates the impact of the program through randomized controlled trials (RCTs) involving 100 treatment and 100 control communities in each country, and finds that, on average, the program did not have a statistically significant impact on the use or content of maternal and newborn health services, nor on the sense of civic efficacy or civic participation among recent mothers in the communities that were offered it.

Read full paper

Last updated on 01/24/2020

Europe Is Us: Brexit Will Not Take Place

Abstract:

Muriel Rouyer, May 2019 

The saga of Brexit, an elusive public policy with shifting objectives but devastating costs, confirms an unpleasant reality: economic interdependence keeps majoritarian will, even that of a sovereign people, in check. Brexit raises the question, fundamental in democracy, of political freedom, which itself calls into question the political community within which freely agreed-upon choices are made.

Read full paper

Engaging Patients for Research That Matters: IBD Partners

Abstract:

Elena Fagotto, Project on Transparency and Technology for Better Health, March 2019

The Project on Transparency and Technology for Better Health was established to conduct comparative case studies on platforms that empower patients through information to provide an inventory and typology of initiatives. This case study takes a look at IBD Partners, a research network connecting nearly 15,500 IBD patients with over 300 researchers. Patients can contribute their self-reported health data for research by filling out surveys on their health twice a year. This way, patient-generated data feeds into an extensive database that can be accessed by researchers to conduct longitudinal studies, to connect with patients for clinical trials and for prospective studies. Patients can also use the platform to suggest research questions and vote for the most interesting ideas, generating a truly patient-driven research agenda.

Read full paper

The Power of Peer-to-Peer Connections: Breast Cancer Straight Talk Support Facebook Community

Abstract:

Elena Fagotto, Project on Transparency and Technology for Better Health, March 2019

The Project on Transparency and Technology for Better Health was established to conduct comparative case studies on platforms that empower patients through information to provide an inventory and typology of initiatives. This case study takes a look at Breast Cancer Straight Talk Support, a closed Facebook community for women dealing with breast cancer and survivors. With hundreds of posts every day, the group is a safe space where women can vent about feeling scared, depressed, or lonely and receive support from women who “get them.” For many members, the group is a window into other women’s cancer journeys, which gives them perspective and a more proactive attitude to fight the disease. The community is also an important resource to ask questions on treatments, side effects, surgery and more.

Read full paper

Exchanging Information to Create a Learning Health System: The ImproveCareNow Approach to Engagement

Abstract:

Elena Fagotto, Transparency and Technology for Better Health, March 2019

The Project on Transparency and Technology for Better Health was established to conduct comparative case studies on platforms that empower patients through information to provide an inventory and typology of initiatives. This case study details ImproveCareNow (ICN), a network of clinicians, medical centers, patients, families and researchers working together to improve the lives of children with inflammatory bowel disease (IBD). 

Read full paper

Local Governance and Access to Urban Services in Asia

Abstract:

Shabbir Cheema, November 2018

This policy brief explores how democratic processes in local governance affect access to urban services in Asian cities, especially for marginalized groups. It is based on research conducted by a group of national research and training institutions in nine cities in five Asian countries as well as regional dialogue hosted and facilitated by East-West Center with the support of the Swedish International Center for Local Democracy (ICLD). Governance process variables investigated were local government resources and capacity; mechanisms for local participation, accountability, and coordination; use of information and communications technology (ICT); implementation and replication of good practices; and management of peri-urbanization. This brief outlines research findings that are applicable across countries at the city level.

Read full paper

American Democracy: For Whom Does the Death Knell Toll?

Abstract:

Muriel Rouyer, August 2018 

American liberal democracy, once a model throughout the world, is in crisis. The most obvious symptom of this malaise is a paradoxical attitude that pervades an underprivileged section of the population that, against its own interests, supports the ruling plutocrats. How can we explain this?

Read full paper

Proceedings -- Getting to Eighty Percent: A Symposium Advancing Voter Participation

Abstract:

May 2018 

At the core of the work of the Ash Center and the Kennedy School is the effort to understand how citizens and institutions come together to make democracy work, and rarely before has the importance of this effort been more evident. Underlying the deceptively simple idea of making democracy work are a number of large themes: protecting the fundamental norms of democracy and democratic processes from challenges both in the United States and internationally; encouraging innovation in governance and public accountability; preventing the massive inequalities of our economic system from permeating our democracy and threatening its existence.

Indeed, one essential element of “making democracy work” in the United States is to have as close to full and inclusive participation of the people who comprise our democracy as we possibly can. The name of our May 3rd symposium, “Getting to 80 percent,” was chosen with intent; while a goal of 80 percent participation is achievable, it will require a real stretch—not tinkering around the edges of the current system, but instead pursuing a major set of innovative ideas and practices.

Read the proceedings

Building a Democracy Machine: Toward an Integrated and Empowered Form of Civic Engagement

Abstract:

John Gastil, June 2016 

In this paper, John Gastil calls for designers, reformers, and government sponsors to join together to build an integrated online commons, which links together the best existing tools by making them components in a larger “Democracy Machine.” Gastil sketches out design principles and features that would enable this platform to draw new people into the civic sphere, encourage more sustained and deliberative engagement, and send ongoing feedback to both government and citizens to improve how the public interfaces with the public sector.

Read full paper

The Political Economy of Transparency: What Makes Disclosure Policies Effective?

Citation:

Fung, Archon, David Weil, Mary Graham, and Elena Fagotto. 2004. “The Political Economy of Transparency: What Makes Disclosure Policies Effective?”.

Abstract:

Archon Fung, David Weil, Mary Graham and Elena Fagotto, December 2004 

Transparency systems have emerged in recent years as a mainstream regulatory tool, an important development in social policy. Transparency systems are government mandates that require corporations or other organizations to provide the public with factual information about their products and practices. Such systems have a wide range of regulatory purposes which include protecting investors, improving public health and safety, reducing pollution, minimizing corruption and improving public services.

Read the full report

Transit Transparency: Effective Disclosure through Open Data

Abstract:

Francisca M. Rojas, June 2012 

The public disclosure of transit information by agencies is a successful case of open data adoption in the United States. Transit transparency offers insights into the elements that enable effective disclosure and delivery of digital information to the public in cases where there is a strong demand for that information, and where the disclosed information is available at the right place and time for users to act upon.

Read the full report

Disruptive Logic: A New Paradigm For Social Change

Abstract:

Tim Burke and Gigi Georges, December 2011

As the U.S. grapples with fiscal crisis – facing spiraling deficits, dangerous levels of debt, and the worst economic recession in some 70 years – Americans understand that all levels of their government must take action. Calls are growing louder from across the political spectrum for the same spirit of cost-cutting and financial restraint within government that so many families have had to embrace. According to a Pew Research Center poll in early 2011, however, even while Americans increasingly recognize the need to halt increases in spending, many remain reluctant to embrace specific cuts. There is still not one area of domestic federal spending – whether education, veterans' benefits, health care or public safety – that more Americans, when pressed, want to decrease more than they want to increase.

Read Full Paper


From Government 2.0 to Society 2.0: Pathways to Engagement, Collaboration, and Transformation

Abstract:

Archon Fung and Zachary Tumin, October 2011

In June 2010, 25 leaders of government and industry convened at Harvard University to assess the move to “Government 2.0” to date; to share insight into its limits and possibilities, as well as its enablers and obstacles; and to assess the road ahead. This is a report of that meeting, made possible by a grant from Microsoft.

Read Full Paper

Innovations in Post-Conflict Transitions: The United Nations Development Program in the Democratic Republic of the Congo

Abstract:

Sarah Dix, Diego Miranda, and Charles H. Norchi, February 2010

Between January and September of 2007, a team composed of Dr. Sarah Dix, Mr. Diego Miranda, and Dr. Charles H. Norchi appraised the programs, procedures, and management of the United Nations Development Program (UNDP) country office in the Democratic Republic of the Congo (DRC) as implemented from 2003 to 2007. During that period, the country program cycle focused on promoting good governance, conflict prevention, community recovery, and fighting HIV/AIDS, malaria, and tuberculosis. Overall, the office managed more than $500 million for all programs, becoming one of the three largest UNDP country operations in the world. This report examines the organizational dimensions of the UNDP office in the DRC and analyzes its most important program innovations.

Read Full Paper

Open Government and Open Society

Citation:

Fung, Archon, and David Weil. 2010. “Open Government and Open Society”.

Abstract:

Archon Fung and David Weil, February 2010

Enthusiasts of transparency should be aware of two major pitfalls that may mar this achievement. The first is that government transparency, though driven by progressive impulses, may draw excessive attention to government's mistakes and so have the consequence of reinforcing a conservative image of government as incompetent and corrupt. The second is that all this energy devoted to making open government comes at the expense of leaving the operations of large private sector organizations – banks, manufacturers, health providers, food producers, drug companies, and the like – opaque and secret. In the major industrialized democracies (but not in many developing countries or in authoritarian regimes), these private sector organizations threaten the health and well-being of citizens at least as much as government.

Read Full Paper


The Analytics Playbook for Cities: A Navigational Tool for Understanding Data Analytics in Local Government, Confronting Trade-Offs, and Implementing Effectively


Abstract:

Amen Ra Mashariki and Nicolas Diaz, August 2020 

Properly used, data can help city government improve the efficiency of its operations, save money, and provide better services. Used haphazardly, however, analytics may increase risks to citizens’ privacy, heighten cybersecurity threats, and even perpetuate inequities.

Given these complexities and potentials, many cities have begun to install analytics and data units, often headed by a chief data officer, a new title for data-driven leaders in government. This report is aimed at practitioners who are thinking about naming their first CDO, starting their first analytics team, or empowering an existing group of individuals.

Download the full report

Full Text

Full Report Text

About the authors

Amen Ra Mashariki is the Global Director of the Data Lab at WRI. There he works across programs, international offices and centers to identify data solutions to turn big ideas into action to sustain our natural resources—the foundation of economic opportunity and human well-being. Amen Ra is also a Data + Digital fellow at the Beeck Center for Social Impact and Innovation where he partners with academic, public and private sector thought leaders to shape best practices and strategies on how to use data to maximize impact in our communities.

Prior to this, Dr. Mashariki served as Head of Machine Learning at Urbint, adjunct faculty at NYU’s Center for Urban Science and Progress (CUSP), Fellow at the Harvard Ash Center for Democratic Governance and Innovation, Head of Urban Analytics at Esri, Chief Analytics Officer for the City of New York, Director of the Mayor’s Office of Data Analytics at the City of New York, White House Fellow, and Chief Technology Officer for the Office of Personnel Management of the United States.

Amen earned a Doctorate in Engineering from Morgan State University, as well as Master of Science and Bachelor of Science degrees in Computer Science from Howard University and Lincoln University, respectively.

Nicolas Diaz Amigo is a graduate of the Master in Public Policy program at Harvard Kennedy School. He has held research and fellowship roles at the Bloomberg Harvard City Leadership Initiative and at digitalHKS. Professionally, he served as the first Coordinator of Public Innovation at the Mayor’s Office in Santiago, Chile’s City Hall, and has consulted for cities and local government organizations across the world on the topics of public sector innovation, data analytics, digital transformation, performance management, budgeting, and process improvement.

About the series editor

David Eaves is a lecturer in Public Policy at the Harvard Kennedy School, where he teaches on digital transformation in government. In 2009, as an adviser to the Office of the Mayor of Vancouver, David proposed and helped draft the Open Motion, which created one of the first open data portals in Canada and the world. He subsequently advised the Canadian government on its open data strategy, where his parliamentary committee testimony laid out the core policy structure that has guided multiple governments’ approach to the issue. He has gone on to work with numerous local, state, and national governments advising on technology and policy issues, including sitting on Ontario's Open Government Engagement Team in 2014–2015.

In addition to working with government officials, David served as the first Director of Education for Code for America — training each cohort of fellows for their work with cities. David has also worked with 18F and the Presidential Innovation Fellows at the White House providing training and support.

Acknowledgements

This report has been possible with the sponsorship of digitalHKS, a project at Harvard Kennedy School that studies the intersection of digital technologies and governance. The authors are grateful for the generosity of the public servants who provided case studies. Additionally, many individuals provided invaluable feedback on earlier versions of this report, including Kadijatou Diallo, Tommaso Cariati, Westerley Gorayeb, Natasha Japanwala, Lauren Lombardo, Michael McKenna Miller, Nagela Nukuna, Emily Rapp, Naeha Rashid, Imara Salas, and Blanka Soulava. This report is an independent work product and the views expressed are those of the authors.

About the Ash Center

The Roy and Lila Ash Center for Democratic Governance and Innovation advances excellence and innovation in governance and public policy through research, education, and public discussion. By training the very best leaders, developing powerful new ideas, and disseminating innovative solutions and institutional reforms, the Center’s goal is to meet the profound challenges facing the world’s citizens. The Ford Foundation is a founding donor of the Center. Additional information about the Ash Center is available at ash.harvard.edu.

This research paper is one in a series published by the Ash Center for Democratic Governance and Innovation at Harvard University’s John F. Kennedy School of Government. The views expressed in the Ash Center Policy Briefs Series are those of the author(s) and do not necessarily reflect those of the John F. Kennedy School of Government or of Harvard University. The papers in this series are intended to elicit feedback and to encourage debate on important public policy challenges.

This paper is copyrighted by the author(s). It cannot be reproduced or reused without permission. Pursuant to the Ash Center’s Open Access Policy, this paper is available to the public at ash.harvard.edu free of charge. 

Executive Summary

Properly used, data can help city government improve the efficiency of its operations, save money, and provide better services. Used haphazardly, however, analytics may increase risks to citizens’ privacy, heighten cybersecurity threats, and even perpetuate inequities.

Given these complexities and potentials, many cities have begun to install analytics and data units, often headed by a chief data officer, a new title for data-driven leaders in government. This report is aimed at practitioners who are thinking about naming their first CDO, starting their first analytics team, or empowering an existing group of individuals.

In the following pages, we provide city leaders with:

  • Definitions. To understand what an analytics team in city government looks like, as well as alternative team structures in the realm of data, digital government, and innovation.
  • Principles. The attitudes and mindsets that are useful when trying to enact ambitious change in local government, such as ensuring executive buy-in and seeking allies outside of government.
  • Plays. The organizational, strategic, and tactical considerations that a city must think through when starting and supporting an analytics team. These are presented in sequential order: setting up a team for success, building up from scratch, and managing the day-to-day, along with important considerations (like privacy and ethics) that should be present throughout.

1. Introduction: The potential use and misuse of data analytics in city government 

Imagine a world where our cities not only deliver the services that we expect from them, from weekly trash collection to running our schools to supporting local businesses, but also tailor those services to each of their citizens. Like Netflix recommending the next show you should watch based on your previous views or Amazon steering you to a product you didn’t even know you needed, local governments could connect citizens who request a service with other social programs they qualify for.

Imagine a city hall not only reacting to problems — a student dropping out of school or a building catching on fire — when they occur, but using data to identify risks and take preventive measures before they happen.

These are not pipe dreams. Municipal departments already possess an abundance of data, from utility bills to tax records to school enrollment histories, though the value of that data is often locked away in silos. City analytics, a discipline that combines operations research and data science, has the potential to unlock the big insights behind the big data and move local governments toward higher efficiency and effectiveness in the provision of services and the prevention of social ills.

When it comes to the use of advanced analytics, the public sector needs to catch up to the progress seen in the private sector over the past few decades. But strictly copying private sector practices is not the right approach; there are additional considerations that come with handling data collected by a public institution. With all the potential analytics hold for improving the delivery of local services, their risks must also be carefully examined by analysts, the senior leaders who oversee their work, and civil society at large. A misuse of analytics may pose cybersecurity threats, threaten privacy, or reinforce existing inequities.

Lessons from NYC: Breaking silos increased the need for cybersecurity

To exemplify the risks of misusing analytics, let’s take the example of cybersecurity. A few years ago, New York City was considering improving its monitoring system for underground assets, such as sewage pipes, that were being tracked by several agencies, each with its own software system and database. Bringing all of those together into a single system would have allowed for more efficient management. But while a breach of any particular database would not be particularly catastrophic, having the city’s entire underground mapped by bad actors would be a major threat. The standards for how data is safely stored and shared internally had to be reexamined.

Doing analytics in government also comes with unique barriers, such as the difficulty of hiring individuals with the right technological background or moving an organizational culture that does not always welcome innovation.

But that doesn’t mean there is no path forward. In fact, an increasing number of cities are beginning to systematically incorporate analytics, often in the form of new teams headed by leaders with titles like chief data officer, chief analytics officer, or chief performance officer. This playbook examines successes and failures, and shows what hard choices have so far been made along the way. An analytics team must face difficult questions that do not have clear answers: Who do we hire first? How do we prioritize our projects? How much time should we spend communicating our successes? By illuminating some of the trade-offs cities have faced, we hope this document will help city officials find a strategy that works for their own context, limitations, and objectives.

In sports, it is important for a coach to understand the wide variety of plays that may come in handy in different circumstances. A playbook provides a sense of orientation in a context of high complexity. What should happen at the start of the game, what do we do when we are in overtime, etc. But just like in sports, understanding the generic plays is not enough to win the game. It’s all in the execution and flexibility in adapting to whatever resources are available.

The lessons in this playbook rely heavily upon Amen Ra Mashariki’s experiences as head of the Mayor’s Office of Data Analytics (MODA) in New York City, where he led a team of nine analysts and policy specialists. To illustrate some of the lessons throughout this document, we will call out these insights in boxes like the one on the previous page. But we have also put these ideas to the test by consulting some of the leading professionals in the field, who have generously provided case studies that illustrate the potential and barriers of urban analytics.

One thing readers should be wary of is the temptation to think that every problem can be solved with data and technology. Using analytics heedlessly may even make some problems worse. The intention of this report, therefore, is to allow for a critical application of sophisticated tools. We will address several dimensions of those tools including learning how to identify which problems are suited for an analytics approach, understanding potential issues, setting up the right team to tackle those issues, and making sure the team gets the right resources for success.

This document is structured as follows. We start by defining what an analytics team is and what it is not. Then we highlight the principles that guide successful teams. Finally, we detail specific plays: the key trade-offs, questions, and frameworks that an analytics team in local government must consider in order to effectively pursue its mission.

This playbook won’t give you all the answers for starting an analytics team. Instead, it is meant to orient you toward questions that allow you to connect the work you want to do with whatever is important and feasible right now. The skill sets of your employees, the legal framework you are operating under, and even the political cycle will mold your journey. We have tried to offer some thoughts and ideas, but ultimately you will have to decide which plays to focus on at any given time. To get better at the game, you have to start somewhere.

2. Definitions

In their desire to move toward data-driven organizations, well-intentioned public managers may fail to fully grasp what they are getting themselves into. Not all organizations need an analytics team, but by providing a clear sense of what analytics teams are and what they are trying to accomplish, we hope to make it easier for senior officials to choose whether to pursue this path.

2.1. What an analytics team is

Analytics teams may take different names and vary in size, scope, and mission across different jurisdictions.[1] Our basic definition of an analytics team in local government is a specialized group that applies data analysis to uncover insights with the hope of improving the operations of city departments.

In practice, what we most often find is a team of between one and 20 employees that has been formally integrated into the bureaucracy. This team will leverage data-intensive tools such as machine learning or mapping to define, evaluate, and propose solutions to public problems. Its mission statement is usually some variation of “improving the outcome of what the city does to create more public value for constituents at lower cost” — though the most pressing areas to tackle to deliver public value will change depending on where you are and whom you ask.

2.2. What an analytics team is not

Other data-driven management and digital-government-adjacent groups can take forms such as:

  • A traditional information technology (IT) team that maintains technology assets and keeps systems running
  • A performance management team that works across departments to define the relevant metrics for success
  • A digital service team that ensures interoperability of the data, such as allowing members of one department to easily access data from other departments
  • An open data team that maintains an online portal aimed at sharing city data sets with the general public
  • A data governance team that sets citywide standards and convenes analysts across the city to build a citywide data culture

Note that these functions can be complementary to the work of an analytics team. In some places there may be overlap (for example, a chief data officer may set open data policies), and cities may choose to prioritize some of these over others.

3. Principles

While no single blueprint can work for every city, you will find throughout this report some common principles that underlie any effort to bring more analytical thinking to local governments. Teams should:

  • Commit to better service delivery. An analytics team must be driven by a desire to make government work better for its citizens in a transparent and accountable way.
  • Be both the disrupter and the listener. Given the nascent state of analytics in government, analytics teams may have to act like internal entrepreneurs who question the way things are done. But this desire for disruption must be paired with a profound curiosity to understand the nuances and challenges that led to the status quo.
  • Start with the problem, not the solution. An analytics team should avoid the temptation to run around with fancy tools in search of places to apply them. Instead, its strategic outlook and tactical moves should be centered around pressing challenges faced by the city or its individual departments. And when a need is found, a sense of practicality and viability should supersede the drive for sophistication. Simple is better than complex.
  • Ensure executive buy-in. As talented and persuasive as data-driven public servants may be, support from key stakeholders at the top is essential. Whatever your specific institutional arrangement, buy-in from your mayor, her chief of staff, the city manager, or someone sufficiently high up will allow you to overcome the resistance to change you’ll encounter along the way.
  • Work in the open. The work should not be top secret. The projects undertaken should be considered part of a conversation about public service delivery that is appropriately communicated within city hall and with the public. Blog posts can help explain the aim of each project, and code can be posted online for scrutiny. A commitment to transparency should ensure that those seeking information about a process or its results can easily access it.
  • Find allies. You will require allies both inside and outside the organization. In this playbook we will discuss in detail ways you can engage with department heads to create quick wins, when to extend ties to academia, and how the press can come in handy.
  • Quickly deliver value, but think long term. The team will need to rapidly demonstrate what it is bringing to the table and deliver quick wins. However, team members must also be aware that to create even more value they need to think about long-term investments in infrastructure and changes to the work culture. An effective manager must carefully consider both time frames simultaneously. 

4. The plays

 

4.1. Setting up the right team for success

This section begins by looking at the most relevant part of your analytics effort: the people behind it and how they should be organized.

4.1.1 Step zero: Making the case for data analytics

How do you argue the need for an analytics team?

Before beginning to establish an analytics team, stakeholders may need to be persuaded. Whether it is a city administrator, a councilwoman, or the general public, a convincing case must be made that a new team will help the city reach its goals.

Jane Wiseman, CEO of the Boston-based management consulting firm Institute for Excellence in Government, in studying cities investing in analytic efforts,[2] points toward multiple types of value that can be attributed to data teams:

  • Fraud detection cost savings
  • Efficiency improvements that reduce costs
  • Accuracy improvements that reduce costs
  • Increased revenue capture
  • Efficiency improvements that improve outcomes
  • Operational changes that increase safety
  • Increased faith in government due to more transparency

Naturally, not all stakeholders will care equally for every one of these. A city administrator may be more concerned with cutting costs than with transparency, for example. It is important to tailor the message in a way that considers particular stakeholder interests.

On the issue of improving city finances, a McKinsey study[3] on the use of data analytics in government to find fraud found that the return on investment could be as high as 15 times the cost. Wiseman also points toward two municipal data teams that self-reported their return on investment (ROI): the Louisville Metro Government calculated a five-to-one return for its analytics efforts, and the Cincinnati Office of Performance and Data Analytics calculated $6.1 million in added value to the city over two years of operation, a nine-to-one return.
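How such figures are derived is, at its core, simple arithmetic: documented value divided by the cost of the team. The sketch below is purely illustrative; the dollar amounts are invented and are not drawn from the Louisville or Cincinnati examples.

```python
# Back-of-the-envelope ROI sketch. All figures are hypothetical and for
# illustration only; a real team should document how each dollar of value
# was measured and attributed to its projects.
def roi(value_delivered: float, team_cost: float) -> float:
    """Dollars of documented value per dollar spent on the team."""
    return value_delivered / team_cost

annual_team_cost = 700_000       # salaries, tools, overhead (invented)
documented_value = 3_500_000     # savings plus new revenue attributed to projects (invented)

print(f"Return on investment: {roi(documented_value, annual_team_cost):.0f} to 1")  # -> 5 to 1
```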

4.1.2 Defining structure: Decentralized or centralized?

Should the analytics team be in one department or agency or spread out among several?

An analytics team may be a single central office that takes on projects across the city (centralized model) or a collection of employees who are spread among various departments (decentralized model).

To decide between the two models, a useful rule of thumb is to think about what types of projects would generate the most immediate value for senior sponsors.

If you want to use data to tackle an issue that is handled by a single department (e.g., crime prevention, which is the responsibility of the police department), then a decentralized model, with analysts that sit permanently in that department, will allow for more sophisticated and in-depth research.

On the other hand, if your aim is to address a challenge where different agencies share responsibility (e.g., you want to know which buildings to target for inspection in a city that splits code enforcement tasks among the fire department, housing, etc.), then you should establish a centralized, dedicated unit. Having analysts who work with multiple departments will avoid task duplication and also reap the benefits of breaking down data silos, instead of having each department create its own models from scratch with partial data.

In “Data-Driven Government: The Role of Chief Data Officers,”[4] published by the IBM Center for the Business of Government, Wiseman offers a systematic comparison of the centralized and decentralized models for government agencies, as seen in the following table.

 

 

Chief data officer (CDO) focus

Centralized:
  • Analytics resources in a single team under the CDO’s leadership
  • CDO team provides data and analytics services to key executives and managers, functioning as an internal consultant
  • Team works in partnership with executives and managers to define scope and project needs
  • Some bureaus or agencies — typically the better resourced or statistics-driven ones — may have their own analytics resources, but the majority rely on the centralized CDO team

Decentralized:
  • CDO team creates distributed capacity across government by embedding talent in bureaus through training and coaching
  • CDO team creates tools, platforms, and data standards that speed adoption of data skills in bureaus
  • Each bureau/agency/office is responsible for developing its own analytics capability, which can range from budget and policy analysts who can complete basic descriptive statistics and dashboards to analysts with the skill to perform data science tasks such as predictive analytics

Ideal for

Centralized:
  • Specialized skills or subject matter expertise needed for highly technical work
  • Analytics projects that require a high degree of confidentiality, such as investigations

Decentralized:
  • Large-scale enterprises with similar or low-complexity operations spread across many bureaus, divisions or geographies
  • Agencies with a high level of existing data skill or broad adoption of data literacy, such as scientific or statistical agencies

Benefits

Centralized:
  • Centralized pool of analytics talent allows sharing of specialized skills across the enterprise from a common hub, which saves money since highly trained analytics staff can be expensive for government
  • Efficiencies are gained via peer support and collaboration among team members
  • Centralized team is better able to standardize tools and processes across government, which can save time and money and help develop deeper expertise in the chosen tools and methods
  • Team can facilitate cross-organizational data initiatives due to its enterprise-wide view of data assets and needs

Decentralized:
  • Leaders and managers in bureaus have more control over their analytics resources, may get more timely responses to their requests, and may also more immediately deploy analytics insights
  • Analysts embedded in bureaus develop subject matter expertise that makes them valuable to their leadership and speeds time to results
  • Embedded analysts can foster greater adoption of data culture across the enterprise, which can lead to faster organizational culture change
  • Skills gained in self-service analytics are transferable across government, spreading benefit

Limitations

Centralized:
  • Slow growth of sustainable analytics talent in the bureaus
  • Can be challenging to achieve scale with a small centralized team, as surge capacity may need to be deployed for a high-priority task

Decentralized:
  • Putting decision-making and control of analytics in the hands of bureau heads leads to uneven attention and results, with some investing heavily and others giving it low priority or not appointing an analytics officer unless compelled to do so
  • Decentralized model limits peer cohorts for data-focused employees and may result in a more limited career ladder

Case study: Chicago’s CDO has a centralized mandate[5]

The first U.S. city to have a chief data officer (CDO) was Chicago. After briefly being part of the mayor’s office, the position was then moved to the Department of Innovation and Technology.

Tom Schenk Jr. served as the city’s second CDO and as deputy director of IT, which gave him both a centralized mandate to drive data analytics and the operational responsibility for maintaining and upgrading the city’s databases and digital platforms.

This centralization of responsibilities allowed Schenk and his team not only to push for concrete data analytics projects, but also to take control of data governance. They began building a data inventory of the entire city and optimizing the data infrastructure for easier analytics in the future.

Being able to inventory the data was essential for setting up future successes. However, Schenk warns that quick wins are an absolute necessity. “Stuff will get hard and you will need to ask a lot of people [for help],” he says. “An inventory of data will take a lot of time and effort, and both residents and internal stakeholders will not perceive its value until you show some progress and real, tangible results.”

Schenk and his team also had to consider how to choose which projects to spend time on. To decide, the team worked closely with the University of Chicago’s master of science program in Computational Analysis & Public Policy, creating a framework to screen projects and assess the data maturity required for meaningful analysis.[6]

A longer write-up of this case can be found in the annex.

 

4.1.3 Who to hire

Who is the first person you hire for your analytics team? What about the second, third, and fourth?

There is no predetermined profile or career progression for the members of an analytics office. One reason is that these teams must cover a wide variety of tasks, from project management and complex analysis to building effective cooperation with domain experts.

An article published in the Harvard Business Review in 2019 noted that data science teams in the private sector often attempt to discover “profound new business capabilities,” and because of this such teams must be made up of generalists who are focused on learning and iteration rather than specialists who do only one thing efficiently.[7] This argument can be made even more strongly for local governments because of the immense quantity and complexity of the services they provide.

For your first team member, consider looking for someone within the organization who has some local experience, rather than hiring out of a prestigious economic-analysis consultancy or data science firm. Even if this is a job that will require a good understanding of analytical methods, the first employee’s main challenge will be effectively navigating the tacit idiosyncrasies of a political and organizational environment. An urban planner at the transportation department, for example, may have better insight into both the current challenges the city is facing and whom to partner with to get stuff done. Keep your eyes open.

When considering further hires, note that in analytics offices around the world it is common to find professionals as diverse as:

  • Data scientists
  • Urban planners
  • GIS managers and analysts
  • Political scientists
  • Social science researchers/analysts
  • Mathematicians and statisticians

Some skills will be valuable regardless of previous occupation: empathy (to understand the challenges of others), communication (to work effectively in collaborative teams), and curiosity (to always be on the lookout for status quo thinking that needs to be challenged).

Keep in mind that cultural diversity adds to the strength of the team (especially if it mirrors the diversity in your city) and will help avoid analytic projects with blindspots in issues such as race, differences among neighborhoods, etc. For more on this, see the passage on algorithmic bias in section 4.4.2, “The pitfalls of analytics.”

4.1.4 How to hire

How do you overcome some of the challenges in attracting talented and motivated individuals to local government?

Once you start looking outside the organization, it will not be easy to hire who you want. Young, ambitious professionals well-versed in analytical skills may feel more at home in a small start-up than in a big, clunky bureaucracy. A McKinsey article about trends in the workforce[8] highlighted the importance millennials put on flexibility, the availability of mentor relationships, and the autonomy to tackle their own projects — qualities not usually associated with the public sector.

Moreover, there is the issue of money. The public sector usually has strict rules about how much to pay people according to their place in the organization chart, and the average starting salary for data scientists is often higher than what most agency heads earn — so highly skilled professionals will be offered significantly less than what they could earn elsewhere.

One advantage that you do have is the ability to offer impact and purpose. You can structure your portfolio of projects in a way that gives each analyst the opportunity to work on complex urban challenges that may affect the lives of thousands of citizens, while bringing forth new methodologies and ideas. Team members may also have access to high-level individuals in the city. But this also requires you to create a workflow with the proper autonomy for everyone on your team.

Local universities and colleges are an important resource. Whether you are located in a small city or a large metropolis, there should be a college nearby with a data science program. A few universities even offer specialized programs, such as the University of Chicago’s Master of Science in Computational Analysis and Public Policy[9] and New York University’s Urban Analytics track for its Master of Urban Planning.[10] At the very least, your local educational institution should have a statistics program. If searching locally bears no fruit, you might look into specialized networks for connecting the public sector with academia such as the MetroLab Network[11] and the Data Science for Social Good Fellowship at Carnegie Mellon University.[12]

Lessons from NYC: Accepting a faster changing team

At times, the traditional model of structuring a team in local government may need to be reconsidered. In New York City, Amen knew that if he hired young people he would not be able to retain them for long. So he kept a team with high turnover and a frantic schedule of projects, offering his team the ability to work with autonomy on high-stakes issues.

           

 

Effectively engaging educational institutions means giving them access to interesting data sets and projects. For professors and students, getting their hands on real data for analysis is incredibly enticing. Graduate students must often do capstone projects with real-world clients in order to graduate. Finally, you may use this as an opportunity to create a pipeline for talent, offering student internships that may turn into full-time jobs by graduation. Regardless of the specific approach you take, hiring should be a continuous task. You may have to be constantly looking at resumes, nurturing your relationships with academia, going to classes to meet with students, looking for interns, etc.

4.2. Starting from scratch

Once in place, your analytics team will need to get started quickly. This will mean sorting out difficulties and finding a way to demonstrate value quickly. This section goes in depth into those crucial first steps.

4.2.1 How to find your first project

How do you spot a “minimum viable product”?

Your first analytics project is important. It will provide an opportunity to prove the value of the team to leadership and to senior officials throughout the city, so you should start not with a solution in mind but with a real problem that the city is facing — ideally, a challenge that has been clearly identified as a priority by leaders.

To further build the case for the analytics team, it should be a problem that can be tackled by integrating data sets from across multiple agencies. This will show the value of breaking data silos and will probably uncover problems of poor data collection. Furthermore, problems where the data is scattered are often complex policy issues that require deeper thinking — something that may have been ignored so far.
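As a minimal, hypothetical sketch of what breaking data silos looks like in practice (the department names, columns, and building identifier below are invented), the first technical step is often simply joining two agencies’ records on a shared key:

```python
# Minimal sketch of combining records from two departments on a shared
# building identifier. Department names, columns, and the "bin" key are
# hypothetical; real projects spend most of their effort reconciling
# inconsistent identifiers before a join like this is possible.
import pandas as pd

fire_complaints = pd.DataFrame({
    "bin": [1001, 1001, 1002, 1003],
    "complaint_type": ["smoke detector", "blocked exit", "smoke detector", "wiring"],
})
housing_violations = pd.DataFrame({
    "bin": [1001, 1003, 1003, 1004],
    "violation_class": ["B", "C", "C", "A"],
})

# One row per building, with a count of records from each source.
per_building = pd.concat(
    [
        fire_complaints.groupby("bin").size().rename("n_fire_complaints"),
        housing_violations.groupby("bin").size().rename("n_housing_violations"),
    ],
    axis=1,
).fillna(0)

# Buildings that appear in both data sets are candidates for a joint inspection list.
print(per_building.sort_values(["n_fire_complaints", "n_housing_violations"], ascending=False))
```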

Finally, this cannot be an intellectual exercise only. Although there is plenty of value in visualizing a problem that may have been hidden from the view of stakeholders, to truly cement your first project as a success you should work with your partners toward a clear strategy for delivery. Do not assume that this will happen on its own. Talk with partnering departments beforehand to clarify how analytical results can help with the implementation or piloting of operational innovations.

From the very beginning, identify partners across the entire city government who see the worth of your work. They may be senior officials or new employees. Regardless, make them your data champions. Empower them. But do not forget to communicate to the top; the senior advisor or chief of staff to the mayor may be very busy in her day-to-day, but she should still be frequently updated on what you are doing.

 

Case study: Piloting London’s Office of Data Analytics[13]

In 2016 a pilot began for the London Office of Data Analytics (LODA). It was orchestrated by the Greater London Authority; 12 London boroughs; Nesta, an innovation NGO; and ASI, a data science firm.

LODA found its first project by asking for suggestions from the participating boroughs. Then, a workshop session was conducted to collaboratively assess the projects based on:

  • Money-saving potential
  • Availability of data
  • The ability to produce insights and deliver results within two months
  • The ability to solve the problem without personal data sets

After the issue selection process, the pilot focused on one use case: leveraging predictive analytics to identify houses in multiple occupation that did not have the appropriate licenses. By combining data sets from multiple sources, the team sought to point inspectors toward likely violations.

Ultimately the project was unable to produce meaningful insights, but it did help inform a protocol for data-sharing among the boroughs.

A longer write-up of this case can be found in the annex.
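To make the LODA use case more concrete, the heavily simplified and hypothetical sketch below ranks properties by estimated risk. The features, training data, and model choice are all invented for illustration; a real project would need far richer inputs, careful validation, and attention to bias before guiding inspections.

```python
# Hypothetical sketch of a predictive model that ranks properties by the
# likelihood of being an unlicensed house in multiple occupation.
# Features and data are invented and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy features: [adults on the electoral roll, council-tax band (0-7), noise complaints last year]
X_train = rng.integers(low=0, high=8, size=(200, 3)).astype(float)
# Toy labels standing in for past inspection outcomes (1 = unlicensed HMO found).
y_train = (X_train[:, 0] + X_train[:, 2] > 9).astype(int)

model = LogisticRegression().fit(X_train, y_train)

# Score uninspected properties and send inspectors to the highest-risk ones first.
X_new = rng.integers(low=0, high=8, size=(5, 3)).astype(float)
risk = model.predict_proba(X_new)[:, 1]
for i in np.argsort(-risk):
    print(f"property {i}: estimated risk {risk[i]:.2f}")
```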

 

4.2.2 Making the case for more funding

How do you argue for more money?

Every budget appropriation process is different, as financial, political, and technical idiosyncrasies play out. However, the more clarity there is in the analytics team’s mission, the easier it will be to articulate that mission and construct a compelling narrative. It helps if the mission is closely aligned with the intentions of senior leaders. Often, analytics teams get stuck trying to do a little bit of everything, which muddles the argument for recurring funding.

You should be aware of the downside to following performance indicators, though, since important outcomes are often not measured. Performance indicators will usually point you toward things that are already being done by each department, potentially blinding you to exploring solutions that may not fit neatly within the existing bureaucratic structure. On the other hand, focusing on improving upon existing solutions may make the potential implementation more straightforward. This is a tension that you will have to navigate constantly.

Once you have finished your first project, you may choose to scale your work either vertically, by growing the complexity of the analysis you provide to your partners (i.e., moving from describing trends in data sets to predictive work that prioritizes resources), or horizontally, by growing the number of agencies with which you are collaborating.

Lessons from NYC: Look at performance indicators

A logical alternative would be to make sure the projects that the team implements are tied to performance indicators. For example, when Amen was leading the New York City team, they only worked on projects that were part of the yearly quantitative goals that had to be reported to the mayor’s office and to the city at large. If a department had a proposition, the analysis had to have a logical and explicit way to move the numbers toward a goal. This discipline made it easier for the analytics team to track its impact in terms of city priorities met and dollars saved, thus making the case for growth.

 

4.3. Managing the day-to-day

As the analytics team begins to find a comfortable place in the organization, it should consider how to transition into a sustainable workflow that delivers the most value. This section illustrates what you can expect the team’s day-to-day to look like. What sort of projects can the team offer? How does it prioritize projects? How should other departments be brought into the fold? And how do you shift from merely reacting to thinking about the long term?

 

4.3.1 Understanding the multiple uses of analytics: Repertoire of actions

What can you use analytics for?

Once an analytics team is in place and has one or two projects under its belt, it should begin to look for other ways it could add value to ongoing projects, and to let other departments know what services the team can offer to help them do their work.

The table below is adapted from work performed by New York City’s Mayor’s Office of Data Analytics and includes categories of data analysis projects and real use cases for a variety of city needs.

 

 

Prioritizing (Where to go first?)
Why: Ranking a list according to certain criteria can enable more efficient use of resources. Useful when getting to the worst cases earlier can mitigate potential negative effects.
Example: To assist the Department of Education’s work to make all schools ADA-compliant, MODA used DOE data to prioritize which schools to renovate in order to reduce the number of students with disabilities who needed to use buses.

Scenario Analysis (What if?)
Why: Considering alternative events and their possible outcomes can help policymakers find the best course of action and plan for a greater range of possible policies.
Example: As part of the Mayor’s Office of Long Term Planning and Sustainability’s research for a new commercial composting policy, MODA predicted how much waste local businesses would generate under various regulatory thresholds.

Anomaly Detection (What is out of the ordinary?)
Why: Some processes can be improved by identifying and investigating outliers. Useful when looking for the exception is more feasible than examining every case.
Example: Registration records of all kinds may have a number of files that display unusual characteristics. Flagging and examining those records may reveal procedural oversights or fraudulent transactions.

Matching (What goes with what?)
Why: Matching can optimally pair two groups against a certain set of constraints. Useful for equitably distributing limited resources.
Example: When the appointment scheduler for IDNYC was backlogged with duplicate requests, MODA helped match applicants to times and locations based on indicated preferences.

Estimating (How much will a project cost?)
Why: Projects can be planned more effectively when time, materials, and costs are estimated in advance. Useful for quantifying the costs and benefits of new programs.
Example: MODA worked with the Department of Housing Preservation and Development to estimate the resource requirements and program outcomes for a new set of Enhanced Contractor Review procedures.

Targeting (Where to look?)
Why: Targeting can narrow an operational domain to enable better resource allocation. Useful for identifying a subset for a specific intervention.
Example: MODA created a model to help identify buildings that have displayed a pattern of unsafe living conditions. This enabled the Tenant Harassment Prevention Task Force to follow up with inspections and enforcement actions when necessary.
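As one hedged illustration of the anomaly detection category above (the records and the two-standard-deviation threshold below are invented), a first pass can be as simple as flagging values far from the mean for human review:

```python
# Minimal anomaly-detection sketch: flag records whose processing time is an
# outlier relative to the rest. Data and the threshold are illustrative only.
import statistics

processing_days = {
    "REG-1001": 4, "REG-1002": 6, "REG-1003": 5, "REG-1004": 47,  # 47 looks unusual
    "REG-1005": 7, "REG-1006": 5, "REG-1007": 6, "REG-1008": 4,
}

values = list(processing_days.values())
mean = statistics.mean(values)
stdev = statistics.stdev(values)

# Anything more than two standard deviations from the mean is flagged for a
# human analyst to review; the flag is a prompt, not a verdict.
flagged = {record: days for record, days in processing_days.items()
           if abs(days - mean) > 2 * stdev}
print(flagged)  # -> {'REG-1004': 47}
```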

 

4.3.2 Picking the right projects

How do you choose what to work on?

At some point, you will have to choose between multiple projects that are asking for the analytics team’s resources and attention. Look for projects with the right:

  • Partners: Do you have buy-in from stakeholders to try new solutions and put insights into practice?
  • Data: Is there meaningful data that could lead to insights?
  • Impact: Will the analysis help illuminate a solution for a problem that is relevant?

Chicago’s approach was codified in a form given to departments that requested technical help from the analytics team,[14] allowing the team to quickly assess a number of alternatives. Not every city necessarily needs to develop its own questionnaire, as the day-to-day of project intake may be messier, but there should be some sense of which projects advance the analytics team’s mission and the overall goals of the administration.
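A hypothetical sketch of what such an intake screen can look like follows. The criteria mirror the partners, data, and impact questions above, while the weights and scores are invented and would need to reflect each city’s own priorities.

```python
# Hypothetical project-intake scoring sketch. Criteria follow the
# partners/data/impact questions in the text; weights and scores are invented.
from dataclasses import dataclass

@dataclass
class ProjectProposal:
    name: str
    partner_buyin: int   # 0-5: is the department ready to act on results?
    data_readiness: int  # 0-5: does usable, accessible data exist?
    impact: int          # 0-5: does it address a stated city priority?

WEIGHTS = {"partner_buyin": 0.4, "data_readiness": 0.3, "impact": 0.3}

def score(p: ProjectProposal) -> float:
    """Weighted sum used to compare proposals; not a substitute for judgment."""
    return (WEIGHTS["partner_buyin"] * p.partner_buyin
            + WEIGHTS["data_readiness"] * p.data_readiness
            + WEIGHTS["impact"] * p.impact)

proposals = [
    ProjectProposal("Restaurant inspection targeting", partner_buyin=4, data_readiness=3, impact=5),
    ProjectProposal("Sentiment analysis of 311 calls", partner_buyin=2, data_readiness=4, impact=2),
]
for p in sorted(proposals, key=score, reverse=True):
    print(f"{p.name}: {score(p):.1f}")
```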

Look for performance metrics. If, prior to the collaboration, there is some measurement of whatever problem is being solved or service is being improved, that will help make the case for the value of the team in the form of dollars saved, extra customers served, or whatever metric is relevant.

Consider scope. Some projects may be multi-year collaborations on complex policy areas, while others may have quicker turnarounds. You may want a combination of both in your repertoire.

Finally, mayoral priorities must be a consideration. Most mayors and city administrators will have a strategic plan or list of goals. These are a good start for finding projects that have a mandate to innovate and the appropriate resources for implementation.

4.3.3 Working with other departments

How can the analytics team be an effective partner to others?

By definition, a city’s analytics team is a partner to other departments.

This is important to keep in mind because it should dictate the attitude taken when relating to internal stakeholders. After all, if anything goes wrong, it’s the department that’s directly implementing a service that usually gets blamed, not the analyst. Avoid the negative spotlight. Don’t assume a project that is fascinating to you will be of interest to whichever commissioner or project director you want to pitch it to. Understand their priorities and avoid politically sensitive subjects when you don’t have the appropriate cover.

Instead, when approaching potential partners actively listen to their goals and concerns. When you do have an idea to bring up, make sure to frame it in a way that is attuned to their interests: “We would love to sit down and discuss how data could help you handle this problem or achieve this target.”

Another good practice in early exploratory meetings is to provide a variety of ways of helping. For example, you could:

  • Provide advisory functions upon request
  • Help research a particularly thorny question through data
  • Offer training to people in the department if they have to use data or a specific software tool in their day-to-day functions
  • Assist in developing a procurement strategy for tools that would best serve their need and allow for better analysis

Finally, remind your partners that you will be in the background and will do whatever it takes to make sure they get recognized for the success of the project.

4.3.4 Data stewardship

How do you think about data management?

Eventually, the team will need to start thinking beyond building effective collaborations that lead to fruitful projects and toward setting up the right data infrastructure: how data is managed, organized, and governed. This is known as data stewardship, and it may be vital to the long-term success of your analytics team.

Some cities, like Boston, have installed centralized data warehouses built in collaboration with contractors and maintained internally (see case study below). Such a tool may or may not be the best alternative for your city; what’s most relevant is the concerted and iterative effort to rethink the city’s data infrastructure, and how that can be connected to the mission of the analytics team.
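Before turning to the Boston case below, the sketch that follows shows, for intuition only, the core idea behind a centralized warehouse, using SQLite and invented department tables; a production system involves dedicated infrastructure, automated loading, and access controls far beyond this.

```python
# Minimal sketch of the idea behind a central data warehouse: department
# extracts are loaded into one queryable store instead of living in silos.
# Table names and columns are invented; a production warehouse adds
# automated ingestion, access control, and data-quality checks.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse database
conn.executescript("""
    CREATE TABLE permits (address TEXT, permit_type TEXT, issued_date TEXT);
    CREATE TABLE crashes (address TEXT, severity TEXT, crash_date TEXT);
""")
conn.executemany("INSERT INTO permits VALUES (?, ?, ?)",
                 [("12 Main St", "sidewalk work", "2020-03-01")])
conn.executemany("INSERT INTO crashes VALUES (?, ?, ?)",
                 [("12 Main St", "serious", "2020-03-15")])

# Once both departments' data live in one place, cross-cutting questions
# (e.g., crashes near recent street work) become a single query.
rows = conn.execute("""
    SELECT c.address, c.crash_date, p.permit_type
    FROM crashes c JOIN permits p ON p.address = c.address
""").fetchall()
print(rows)
```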

 

Case study: Boston’s implementation of a centralized data warehouse[15]

The City of Boston has implemented its own centralized data warehouse, which serves as a repository for different databases that can be used throughout several departments. It was built over a period of three years with a contractor hired through a competitive bidding process. The project started small, encompassing a handful of databases, automating the loading of data into the system as much as possible, and expanding from there. Today, more than 30 departments have their data up and running.

Boston decided to make this investment in data infrastructure primarily for two reasons. First, to create a central repository (a single source of “truth”) and avoid the duplication of data across several departments. Second, to shift the work of data analysts from tracking down and cleaning data to adding value to the data by, for example, understanding the operational implications behind each project or ensuring the robustness of the analysis.

Boston’s warehouse has been instrumental in implementing one of the city’s priorities: its Vision Zero initiative for eliminating fatal and serious car crashes,[16] which combines data sets from multiple stakeholders.

A longer write-up of this case can be found in the annex.

 

4.3.5 Pushing projects to production

How can you turn insights into value?

In whatever realm the analytics team in your city may hope to make a difference, it should ultimately strive to not only create insightful analysis, but also to develop products or tools that are useful for other employees of the city and that (either incrementally or radically) change their day-to-day operations. This requires more resources and a larger team, but it is the key to delivering impact and innovation.

In trying to address this challenge, the head of Transport for London’s data team (see case study below) uses philosophies that are not always common in local government: agile development — a way of organizing work around iteration and quick prototypes — and product management, which oversees the continuous and iterative improvement of the applications that generate business value, as opposed to project management, where projects have start and end dates.

Case study: Transport for London and Leveraging Data Products[17]

As the chief data officer of Transport for London (TfL), Lauren Sager Weinstein heads a team of 70 people, including data scientists, product managers, software developers, and data architects. The team is relatively new in the organization, and part of its mission is to centralize the creation of tools that use the vast amount of data generated by London’s transportation network — from traffic information to traffic-signals data to customer data — while preventing the creep of siloed data tools that traditionally didn’t interact with one another.

TfL’s analytics team is particularly concerned with creating organizational change, since if there is no connection to the actual operations that will be affected, then there is no real business value. To do this, they have adopted strategies such as:

  • A product management mindset, where product managers continue to work closely with the operational experts who use the results of an analytics project
  • An agile methodology for their projects, which includes the creation of minimum viable products with defined outcome metrics
  • A commitment to transparency, privacy, and clear communication of their work and its value to the general public

A longer write-up of this case can be found in the annex.

4.3.6 Branding and communications

How should you communicate the analytics team’s efforts?

You need to be proactive about communicating the work of the team.

One reason is that an effective branding strategy will help you with the other challenges we have pointed out: attracting the right talent to the team, building meaningful partnerships with other departments, and securing appropriate resources.

Another reason is that there may be (often well-founded) skepticism and challenges to using methods such as predictive analytics in government. If these fears are not addressed they may paralyze the work of the team.

The most appropriate strategy for the analytics team will likely depend on a combination of organizational choices and the current state of the conversation. It will require different approaches in different situations, depending, for example, on whether you are working on concrete operational issues or pursuing one of the mayor’s priorities, which is likely to have a marketing strategy of its own.

In any case, be prepared to explain in simple terms what you are trying to accomplish, while also having considered the implications of your work (see Section 4.4.2, “The pitfalls of analytics”).

As much as possible, make your data, code, and insights publicly available. Academic institutions will probably be willing to partner up and further analyze your results, which will help you spread the information to an even wider audience as well as validate your approach.

4.4. Important considerations

In this section we include other key concepts that will be vital for the analytics team. Although we offer only a brief explanation for each of these challenging concepts, we believe that the people behind analytics in government should keep open discussions surrounding how to remake government from a platform perspective, the importance of open data and sharing, the ethical considerations behind this work, and, above all, the understanding that when poorly managed, these projects can actively cause harm.

4.4.1 Open data

How can you use open data requirements as an advantage?

Across many jurisdictions, there is an increasing legal or political mandate for open data policies.[18] In New York City, for example, legislation approved in 2012 demands that all departments upload their data using open data standards. The push has often been led by civil society with the aim of increasing transparency, and it is predicated upon established principles such as accessibility, timeliness, and completeness.[19] Governments may also have an interest in publishing their data, as it may lead to civic innovations.

The call for open data within city hall, whether as a formal decree or an informal demand, can be a great way to speed up analytical work by giving you more data sets to work with and by helping build the city’s data infrastructure. But you should consider some caveats:

  • Not all mayors and city managers will be equally receptive (or pressured) to push open data. Therefore, the willingness of departments to comply may vary. Understand the history of your city’s open data efforts, the current landscape, and the applicable laws and regulations to open data, and adapt accordingly.
  • Even if there is a clear mandate, you should watch out for potential duplication. If there is a legal requirement, there may be an office separate from the analytics team that has been set up to help departments with open data compliance. Thus, certain departments may have to send around the data multiple times.
  • Adding data to an open data portal is costly. It will probably involve an active process of data cleaning to scrub away any personally identifiable information.
  • You will have to do some thinking about your users. Most open data efforts assume a single, generic user and tailor the portal to whatever is most convenient for the city.

Lessons from NYC: Considering multiple consumers of open data

In reality, you may have different types of users, with different skill sets and different understanding of what data is useful or not. When Amen was in MODA, he commissioned the social impact firm Reboot to do a study that found at least six personas or types of users who were using, or could potentially use, their open data portal — mappers, liaisons, interpreters, explorers, bystanders, and community champions.

 

Several companies offer prepackaged open data solutions that may be cheaper than building your own portal from scratch, but one thing to consider before committing to one is whether that solution has the right users in mind. Another alternative may be inviting citizens to give input into the city’s open data policies; it’s more expensive but you will reap the benefits of a co-created solution that engages the community.

4.4.2 The pitfalls of analytics: Privacy, security, and algorithmic bias

What should you, the team, and city leadership watch out for?

As governments start to ingest ever-growing quantities of data, new pitfalls appear. The more granular, rich, and timely your data, the more it will grow in usefulness for analytics, and the higher the risk of potential breach or misuse. Entirely new literatures have emerged over the past decade surrounding the challenges of privacy, security, and bias in algorithms, and universities have created graduate programs for professionals to specialize in these areas. Here we only offer a glimpse of the general things to consider.

Privacy

There is an inherent tension between transparency and privacy, even when you are careful with personally identifiable information. As more and more information is made available in open data portals, concerns arise that something like data on parcels can be linked to individual behaviors. And even if you attempt to anonymize or de-identify certain data sets, studies show that information can be reidentified far more easily than was once thought.[20]
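
To make the re-identification risk concrete, here is a minimal, hypothetical sketch in Python (using pandas); the columns, values, and the outside data set are invented for illustration and do not describe any real city release.

```python
# A minimal, hypothetical sketch of re-identification risk.
# The data and column names are invented for illustration only.
import pandas as pd

# "Anonymized" open data release: names removed, but quasi-identifiers kept.
released = pd.DataFrame({
    "zip": ["10451", "10451", "10452"],
    "birth_year": [1980, 1975, 1980],
    "gender": ["F", "M", "F"],
    "service_request": ["noise complaint", "pothole", "heat outage"],
})

# An outside data set (e.g., a voter file) that includes names.
outside = pd.DataFrame({
    "name": ["Jane Doe"],
    "zip": ["10451"],
    "birth_year": [1980],
    "gender": ["F"],
})

# Joining on the quasi-identifiers links the "anonymous" record back to a name.
linked = released.merge(outside, on=["zip", "birth_year", "gender"])
print(linked[["name", "service_request"]])

# A simple k-anonymity check: how many records share each quasi-identifier combination?
group_sizes = released.groupby(["zip", "birth_year", "gender"]).size()
print("Smallest group size (k):", group_sizes.min())  # k == 1 means a unique, re-identifiable row
```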

It is important to listen and respond to the concerns of citizens, and to ensure that protocols are put in place in accordance with national, state, and local laws. Understand, too, that agencies that produce and control data may have their own concerns. Different departments may have different privacy requirements, and if you are trying to bring that information into a centralized system, you should be aware of all the limitations.

Cybersecurity

In recent years the number of cyber attacks targeting state and local governments has grown considerably in the United States and around the world. The role of the chief information security officer (CISO) has been getting more attention.[21]

This presents two relevant considerations for the analytics team. First, it should understand that bringing information together under a centralized data infrastructure may create additional vulnerabilities. In the United States, some data types (such as medical information) have specific standards for security that must be met.

Second, the team needs to understand who has the mandate to deliver cybersecurity, which depends on the city’s structure. In some jurisdictions this mandate may not even exist yet and may need to be created. If there is an IT team separate from the analytics team that has been charged with maintaining data security, that team should be a close partner in helping the analytics team plan the next steps for the city’s data infrastructure.

Algorithmic bias

Many academic and journalistic articles have been published in the past couple of years regarding the fairness of algorithmic decision-making. Some worry that black box algorithms built on faulty data may perpetuate or even accentuate patterns of inequality already in place. Others retort that with the right precautions and oversight, data can lead to a fairer distribution of resources by being quicker to identify those who need them or by predicting where an intervention can have the largest impact.

Some local governments are taking steps. New York City enacted an ordinance to monitor its automated systems,[22] and states like Vermont are developing guidelines.[23]

While there are no simple solutions at the moment, a good starting place is transparency. Having a scrutable open-source library of every project and auditing any automated decision-making system in the city should help create some trust. One useful resource to consider is the Open Data Institute’s Data Ethics Canvas,[24] which encourages analytics teams to be thoughtful about the primary purpose of any project and consider those who may be negatively affected by it.

5. Summary: Data analyst, go out there and listen

In the summer of 2015, cooling towers were killing people in New York City. Legionnaires’ disease is a form of bacterial pneumonia spread through the inhalation of contaminated water vapor, and several sites in the South Bronx saw enough cases in July and August to make it the biggest outbreak of the disease in the city’s history. Public health officials mandated the inspection and disinfection of all cooling towers in the area, but there was no official record of where such towers were located.

MODA was called to help.[25] There had been a dozen fatalities and the city was scrambling to identify and begin tracking all the existing towers. Many departments tried to get as much information as possible in a variety of ways, from creating a self-registration portal to dialing building managers to physically inspecting buildings, resulting in disparate data sets with no “ground truth.”

By quickly pulling together all the information gathered, MODA was able to create a single, reliable data set for the inspections to work from as well as predictive models to find the missing towers. An early machine-learning model had enough predictive power to find approximately 90 percent of all cooling towers in New York. This allowed the city to move quickly, stopping the spread of the disease in a matter of weeks.
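
To make the consolidation step concrete, here is a minimal, hypothetical sketch in Python (using pandas); the building identifiers, sources, and priority rule are invented for illustration and do not reproduce MODA’s actual pipeline.

```python
# A minimal, hypothetical sketch of consolidating disparate reports into one working data set.
# Building identifiers (BINs), sources, and the priority rule are invented for illustration.
import pandas as pd

self_registrations = pd.DataFrame(
    {"bin": [1001, 1002], "source": "self-registration", "has_cooling_tower": [True, True]})
phone_survey = pd.DataFrame(
    {"bin": [1002, 1003], "source": "phone survey", "has_cooling_tower": [True, False]})
field_inspections = pd.DataFrame(
    {"bin": [1003, 1004], "source": "inspection", "has_cooling_tower": [True, True]})

# Stack all sources, then keep one row per building (BIN), preferring field inspections
# as the most authoritative source when reports conflict.
priority = {"inspection": 0, "self-registration": 1, "phone survey": 2}
combined = (
    pd.concat([self_registrations, phone_survey, field_inspections])
      .assign(rank=lambda df: df["source"].map(priority))
      .sort_values("rank")
      .drop_duplicates(subset="bin", keep="first")
      .drop(columns="rank")
      .sort_values("bin")
)
print(combined)
```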

Of course, not every project is so dramatic. Some projects may seem more mundane. Some may never be implemented. But they are all important. As MODA’s support in the Legionnaires’ disease crisis illustrates, adding analytical capacity matters more and more as cities adapt to the growing complexity of the urban challenges they face and work to provide value to citizens. The ability to find previously unseen value in your data sets, tie city operations to gains in performance, demonstrate the worth of new policies with data, and understand the limitations and pitfalls of these approaches so that they are used responsibly will inevitably shift, in the eyes of city leadership, from nice-to-have to must-have.

6. Annex

6.1. Case study: Chicago’s CDO has a centralized mandate[26]

Tom Schenk Jr. became the second Chief Data Officer (CDO) to serve in Chicago, after having played roles in the private, public, and academic sectors and publishing a book on data visualization. His arrival came just after the data team had been moved from the mayor’s office to the Department of Innovation and Technology (DoIT), a department with highly qualified staff but a lot of attrition, which had operational responsibility for maintaining and upgrading every database and major digital platform used by other departments. The appointment gave Schenk dual roles as both CDO and deputy director of IT.

Dual roles that converge

These dual responsibilities provided a tactical advantage, because Schenk had to understand the ins and outs of the systems serving the rest of City Hall. He and his team optimized those systems while also molding them to be more responsive to the needs of future analytics teams. This reduced the barriers between analysts and IT professionals, and whenever a database was hard to access, Tom could easily get his team to grant access and resolve the issue.

Being able to inventory the data was essential for setting up future successes. However, Schenk warns that quick wins are an absolute necessity. “Stuff will get hard and you will need to ask a lot of people [for help],” he says. “An inventory of data will take a lot of time and effort, and both residents and internal stakeholders will not perceive its value until you show some progress and real, tangible results.”

As CDO, Schenk undertook a broad variety of projects, from predictive analytics to open data to automation. One of his most notable projects was a predictive algorithm that identified where kids were most likely to suffer from lead poisoning before they were 1 year old.

Selecting projects as a centralized analytics unit

For a centralized data analytics unit, choosing which projects to spend time on was a constant question. The team worked closely with the University of Chicago. In conjunction with the Master of Science in Computational Analysis and Public Policy program, it created a framework to screen projects and assess the data maturity required for meaningful analysis.[27] Although the criteria were very explicit, the project intake remained flexible as the team evaluated the availability of data, the potential impact, and the partnering unit’s ability to operationalize the results, that is, what it would do after the analysis was conducted.

Effective projects through collaboration

Another publicized initiative was a predictive model for food inspection evaluations.[28] This model, which aimed to optimize the procedure for inspecting restaurants and resulted in critical violations being found seven days earlier, grew out of a grant awarded to Chicago by Bloomberg Philanthropies through its Mayors Challenge.[29] It later became a collaborative partnership between the Chicago Department of Innovation and Technology (DoIT), the Chicago Department of Public Health (CDPH), the Civic Consulting Alliance, and Allstate Insurance.

Schenk also spearheaded the creation of a robust open data policy and hosted hackathons to promote the use of city data and get the community engaged.

An example of the difficulties of collaboration came from a project that intended to predict where illegal cigarette sales occurred. Because of the nature of the issue, the relevant data were collected by both the county and the city, and getting different levels of government (which have different data collection and standardization practices) on board required significant coordination. Because the project was not able to produce concrete results early on, it was eventually sidetracked and discarded.

6.2. Case study: Piloting London’s Office of Data Analytics[30]

In 2016 a pilot began for the London Office of Data Analytics (LODA). It was orchestrated by the Greater London Authority; 12 of London’s 32 boroughs; Nesta, an innovation NGO; and ASI, a data science firm.

Although it was modeled after New York City’s Mayor’s Office of Data Analytics, LODA faced an additional layer of complexity because each of London’s boroughs had its own council, government structure, and data sets without any obvious standardization.

LODA found its first project by asking for suggestions from the participating boroughs. Then, a workshop session was conducted to collaboratively assess the projects based on:

  • Money saving potential
  • Availability of data
  • The ability to produce insights and deliver results within two months
  • The ability to solve the problem without personal data sets

Data to tackle common challenges

After a multistage project selection process, the pilot focused on one use case: leveraging predictive analytics to identify houses in multiple occupation that did not have the appropriate licenses. By combining data sets from multiple sources, the team sought to point inspectors toward likely violations.

One of the main questions the pilot sought to answer was whether this methodology could easily scale to all boroughs of the city or to other policy areas. The ultimate goal was to use the approach to intelligently design shared services and coordinate the actions of different teams.

Data scientists from an external consultancy were brought on board, as the data analytical capacity inside the boroughs was extremely limited and their analysts could not commit full time to this project. Internal and external analysts worked together to identify relevant data sets covering building inspections, noise complaints, tax bands, and more.

Learning from failure

The LODA pilot failed to produce meaningful results or a scalable methodology.

One of the main barriers was that the boroughs collected widely varying data, and when they did collect the same data, they used different formats. This forced the team to create individual models for each borough. Much time had to be spent processing, cleaning, and merging the data. Moreover, the data sets weren’t properly geomatched; that is, there wasn’t a unique identifier for each property that allowed the merging of data.

From a data science standpoint, it was hard to build a predictive model because the variable of interest was only half-labelled; the team knew for certain when some houses fell into the “unlicensed multiple occupation” category, but didn’t know for certain when a house was definitely not in that category. This made assessing the accuracy of the model difficult.
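
To illustrate why, here is a minimal, hypothetical Python sketch of the evaluation problem; every number and proportion is invented and none of it is drawn from the pilot’s data. A model that recovers the (normally unobservable) ground truth reasonably well can still look weak when every unlabeled property is scored as if it were a confirmed negative.

```python
# A minimal, hypothetical illustration of the "half-labelled" (positive-unlabeled) problem.
# All numbers are invented; nothing here comes from the LODA pilot data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Ground truth that is never fully observed: 10% of properties are truly unlicensed.
truly_positive = rng.random(n) < 0.10

# Only some true positives are known (e.g., properties already caught by inspectors).
known_positive = truly_positive & (rng.random(n) < 0.40)

# A hypothetical model that recovers the truth well, flipping roughly 2% of labels.
predicted_positive = np.where(rng.random(n) < 0.02, ~truly_positive, truly_positive)

# Precision measured against known labels treats every unlabeled property as a negative...
naive_precision = np.mean(known_positive[predicted_positive])
# ...while precision against the (unobservable) ground truth tells a different story.
true_precision = np.mean(truly_positive[predicted_positive])

print(f"Precision vs. known labels only: {naive_precision:.2f}")  # looks poor
print(f"Precision vs. ground truth:      {true_precision:.2f}")   # much higher
```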

Other challenges cited by the pilot group included:

  • Data quality: significant effort was devoted to cleaning, and certain data sets could not be geomatched
  • Data availability: private rental data were missing or hard to access
  • Data warehousing: some boroughs did not have centralized business intelligence units or data warehouses, so the data had to be pulled individually from different sections of the organization
  • Rarity of the predicted variable: the variable of interest was too rare in certain boroughs, which made the construction of a predictive model hard
  • Lack of capacity: there was little in-house expertise in the boroughs to work on either the data analysis or the implementation that should have followed

After the pilot, an information-sharing protocol was signed by 12 of the boroughs, and Mayor Sadiq Khan announced the creation of a City Data Analytics Programme to build on the connections made possible by the data partnership between boroughs.

6.3. Case study: Boston’s implementation of a centralized data warehouse[31]

The City of Boston has implemented its own centralized data warehouse, which serves as a repository for different databases that can be used throughout several departments. It was built over a period of three years with a contractor hired through a competitive bidding process. The project started small, encompassing a handful of databases, automating the loading of data into the system as much as possible, and expanding from there. Today, more than 30 departments have their data up and running.

Investing in data infrastructure

Why is Boston investing in its data infrastructure? On one hand, the city avoids conflicting reports by establishing a single repository of reliable information, which builds trust. Before the centralized warehouse was established, several versions of the same data may have existed in different departments. For example, if the 311 phone line (which in many cities is the main aggregator for citizen requests and complaints) gets reports about potholes in the streets, it may keep a separate list and then pass it on to the Department of Transportation, which is in charge of filling the potholes. The 311 division and DOT may keep separate data sets, perhaps with different attributes. (Has the complaint been inspected? Has the citizen been contacted when their case is closed?) If you wanted to understand the entirety of the pothole operation’s performance, you would have to track down several different data sets held by different people in different departments, perhaps even in different formats.

This helps clarify a second advantage of a centralized warehouse: as you move toward advanced analytics — whether by building dashboards, running predictive models, or similar work — a centralized model will save your analysts considerable time and effort, and their results will be more reliable. Because they don’t have to spend time tracking down and cleaning the data, steps that could have taken weeks or months turn into days, and the analytics team can spend its time on activities that really add value, such as working with partners to understand the operational implications behind the data or ensuring the robustness of the analysis.
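
The hypothetical sketch below illustrates this advantage with an in-memory SQLite database standing in for the warehouse; the table names, columns, and values are invented rather than Boston’s actual schema. Once 311 requests and DOT work orders sit in one consistently keyed store, an end-to-end question such as “how long does a pothole take to fix?” becomes a single query.

```python
# A minimal, hypothetical sketch of how a centralized warehouse simplifies analysis.
# Table and column names are invented; this is not Boston's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the city's data warehouse
conn.executescript("""
CREATE TABLE requests_311 (case_id INTEGER, opened_date TEXT, type TEXT, address TEXT);
CREATE TABLE dot_work_orders (case_id INTEGER, closed_date TEXT, crew TEXT);

INSERT INTO requests_311 VALUES
  (1, '2020-03-01', 'pothole', '10 Main St'),
  (2, '2020-03-02', 'pothole', '55 Elm St');
INSERT INTO dot_work_orders VALUES
  (1, '2020-03-04', 'crew A');
""")

-- Not SQL: the comment below is Python.
# With both departments' data in one place and keyed consistently, end-to-end
# performance (report -> repair) is one query instead of weeks of data chasing.
query = """
SELECT r.case_id,
       r.address,
       julianday(w.closed_date) - julianday(r.opened_date) AS days_to_repair
FROM requests_311 r
LEFT JOIN dot_work_orders w ON w.case_id = r.case_id
WHERE r.type = 'pothole';
"""
for row in conn.execute(query):
    print(row)  # e.g. (1, '10 Main St', 3.0) and (2, '55 Elm St', None)
```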

Boston’s warehouse has been instrumental in implementing one of the city’s priorities: its Vision Zero plan for eliminating fatal and serious car crashes.[32] This plan, which involves several departments (transportation, public works, police, and more), is built upon reporting, dashboards, and geospatial visualizations that rely on data collected by several stakeholders. By leveraging the data warehouse, the analytics team at the Department of Innovation and Technology was able to quickly and frequently provide information regarding traffic patterns, street interventions, reported incidents, and more.

Maria Borisova, one of the city’s software engineers, who has overseen the warehouse implementation from its inception, tells us that the technical details are relevant but not that hard to figure out. It is often working with other partners that requires thoughtfulness. Borisova’s counterparts at other city departments often have concerns about centralizing their data, worrying about the accessibility, reliability, and safety of the new system. Patience, starting small and building iteratively, and showing value to your counterparts are vital, she says.

6.4. Case study: Transport for London and leveraging data products[33]

As the chief data officer of Transport for London (TfL), Lauren Sager Weinstein heads a team of 70 people, including data scientists, product managers, software developers, and data architects. The team is relatively new in the organization, and part of its mission is to centralize the creation of tools that use the vast amount of data generated by London’s transportation network — from traffic information to traffic-signal data to customer data — while preventing the creep of siloed data tools that do not interact with each other.

Creating real business value through analytics

The data science team at TfL seeks not only to understand data but also to create meaningful products that add business value to the organization. This requires an understanding of TfL’s strategic priorities,[34] such as expanding bus service to outer London and reducing carbon emissions, as well as of the operational complexity of the network. While keeping in constant communication with other divisions of TfL, a data scientist may find an insight that could be used to improve a process. After a proof of concept to test its usefulness, a team of developers may build a software tool around it and a dashboard to measure its outcomes. One of several product managers may then be in charge of ensuring that the tool continues to fulfill its goal and hit its targets.

To accomplish this, the team works with an agile methodology, building minimum viable products with defined outcome metrics. It is essential that the partners on a given project clearly articulate what analysis they need and what it will be used for. Without a connection to the actual operations that will be affected, there is no real business value. For example, to execute the city’s Vision Zero,[35] which seeks to eradicate road deaths by 2030, the transit division wanted to understand where bus speeding was most frequent so that it could take the right preventive measures.

Another of the team’s recent undertakings has been a pilot to test whether Wi-Fi usage data (collected at tube stations) could improve the organization’s understanding of traffic patterns and help create measures to avoid congestion, improve operations, and prioritize investments. With data coming from over 500 million connection requests, the team prototyped a series of solutions, including a display of approximate train congestion for passengers waiting in stations, among other ideas. By using an agile approach, TfL has been able to test both technical feasibility and business value, and can now begin to consider moving some of the ideas that were tested into production.
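
As a purely illustrative sketch of the kind of aggregation such a pilot involves, the snippet below rolls raw connection events up into a coarse congestion signal; station names, time windows, and thresholds are invented and do not reflect TfL’s actual methodology.

```python
# A minimal, hypothetical sketch of turning raw connection events into a congestion signal.
# Station names, fields, and thresholds are invented; this is not TfL's actual pipeline.
import pandas as pd

events = pd.DataFrame({
    "station": ["Oxford Circus", "Oxford Circus", "Bank", "Oxford Circus"],
    "timestamp": pd.to_datetime([
        "2020-01-15 08:01", "2020-01-15 08:07", "2020-01-15 08:05", "2020-01-15 08:20"]),
})

# Count connection events per station in 15-minute windows.
counts = (
    events.set_index("timestamp")
          .groupby("station")
          .resample("15min")
          .size()
          .rename("connections")
          .reset_index()
)

# Map counts to a coarse congestion label that could be shown to waiting passengers.
counts["congestion"] = pd.cut(
    counts["connections"], bins=[-1, 1, 2, float("inf")], labels=["low", "medium", "high"])
print(counts)
```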

It is also worth noting TfL’s commitment to transparency — the results of pilots are published online[36] — and its clear policies regarding privacy and the use of personal information.[37]

 

[1] Jane Wiseman (2017). Lessons from Leading CDOs: A Framework for Better Civic Analytics.

[3] Susan Cunningham, Mark McMillan, Sara O’Rourke, and Eric Schweikert, “Cracking down on government fraud with data analytics,” McKinsey & Company, October 2018.

[5] Interview with Tom Schenk (October 2019).

[15] Interview with Maria Borisova, Data Engineering Manager at the City of Boston (February 2020).

[17] Interview with Lauren Sager Weinstein, Chief Data Officer at Transport for London (March 2020).

[26] Interview with Tom Schenk (October 2019).

[31] Interview with Maria Borisova, Data Engineering Manager at the City of Boston (February 2020).

[33] Interview with Lauren Sager Weinstein, Chief Data Officer at Transport for London (March 2020).

Last updated on 10/06/2020

Federal COVID‐19 Response Funding for Tribal Governments: Lessons from the CARES Act

Citation:

Henson, Eric, Miriam R. Jorgensen, Joseph Kalt, and Megan Hill. 2020. “Federal COVID‐19 Response Funding for Tribal Governments: Lessons from the CARES Act”.

Abstract:

Eric C. Henson, Megan M. Hill, Miriam R. Jorgensen & Joseph P. Kalt; July 2020 

The federal response to the COVID‐19 pandemic has played out in varied ways over the past several months. For Native nations, the CARES Act (i.e., the Coronavirus Aid, Relief, and Economic Security Act) has been the most prominent component of this response to date. Title V of the Act earmarked $8 billion for tribes, which was allocated in two rounds, with many disbursements taking place in May and June of this year.

This federal response has been critical for many tribes because of the lower socio‐economic starting points for their community members as compared to non‐Indians. Even before the pandemic, the average income of a reservation‐resident Native American household was barely half that of the average U.S. household. Low average incomes, chronically high unemployment rates, and dilapidated or non‐existent infrastructure are persistent challenges for tribal communities and tribal leaders. Layering extremely high coronavirus incidence rates (and the effective closure of many tribal nations’ entire economies) on top of these already challenging circumstances presented tribal governments with a host of new concerns. In other words, at the same time tribal governments’ primary resources were decimated (i.e., the earnings of tribal governmental gaming and non‐gaming enterprises dried up), the demands on tribes increased. They needed these resources to fight the pandemic and to continue to meet the needs of tribal citizens.

Read the full report

Last updated on 07/24/2020

Emerging Stronger than Before: Guidelines for the Federal Role in American Indian and Alaska Native Tribes’ Recovery from the COVID‐19 Pandemic

Abstract:

Eric C. Henson, Megan M. Hill, Miriam R. Jorgensen & Joseph P. Kalt; July 2020 

In this policy brief, we offer guidelines for federal policy reform that can fulfill the United States’ trust responsibility to tribes, adhere to the deepest principles of self‐governance upon which the country is founded, respect and build the governing capacities of tribes, and in the process, enable tribal nations to emerge from this pandemic stronger than they were before. We believe that the most‐needed federal actions are an expansion of tribal control over tribal affairs and territories and increased funding for key investments in tribal communities. 

Read the full report

Last updated on 07/24/2020

Policy Prototyping for the Future of Work

Citation:

Gustetic, Jenn, Carlos Teixeira, Becca Carroll, Joanne Cheung, Susan O’Malley, and Megan Brewster. 2020. “Policy Prototyping for the Future of Work”.

Abstract:

Jenn Gustetic, Carlos Teixeira, Becca Carroll, Joanne Cheung, Susan O'Malley, and Megan Brewster; June 2020

The future of work will require massive re-skilling of the American workforce for which current policy “toolboxes” for economics, labor, technology, workforce development and education are often siloed and antiquated. To meet the needs of tomorrow’s workers, today’s policy makers must grapple with these interdisciplinary policy issues.

This report describes a novel design-driven approach we developed to create policy “prototype” solutions that are inherently interdisciplinary, human-centered, and inclusive for the future of work. Using our design-driven approach, we collaborated with more than 40 interdisciplinary and cross-sector thinkers and doers to generate 8 distinct policy prototypes to support the future of work.

Read the full report

Fiscal Strategies to Help Cities Recover—And Prosper

Citation:

Goldsmith, Stephen, and Charles “Skip” Stitt. 2020. “Fiscal Strategies to Help Cities Recover—And Prosper.” Ash Center for Democratic Governance and Innovation.

Abstract:

Stephen Goldsmith, May 2020 

Despite robust economies, many local officials entered 2020 already worried about budget balances that looked fragile in the short term and problematic in the long term due to enormous pension and health-care issues. Today, in the wake of COVID-19, clearly federal support is necessary, but it is also apparent that it cannot alleviate all the pressures on communities as responsibilities related to the pandemic skyrocket while revenues plummet.

While many public managers will rightly deploy a host of tactical cost-cutting measures, the most creative among them will explore deeper and more strategic changes, such as those presented herein, which will help address the current crisis while preparing their cities for the future. This paper suggests a transition to a culture deeply focused on data, incentives for city workers to produce internal reforms, public-private partnerships that monetize operational excellence, and rapid adoption of both new technologies and good ideas borrowed from other jurisdictions. These more deliberate and strategic approaches may be harder to implement but those offered here need not harm incumbent public employees nor negatively impact cities’ efforts to ensure access and equity. Rather, the strategies we outline should strengthen the efficiency and mandates of existing government offices while helping make cities more resilient and better prepared for tomorrow’s challenges.

Read the full report

Last updated on 06/23/2020

2019 State of Digital Transformation

Citation:

Eaves, David, and Georges Clement. 2020. “2019 State of Digital Transformation”.

Abstract:

David Eaves, Georges Clement; May, 2020

In June of 2019, the Harvard Kennedy School hosted digital service teams from around the world for our annual State of Digital Transformation convening. Over two days, practitioners and academics shared stories of success, discussed challenges, and debated strategy around the opportunities and risks digital technologies present to governments.

Teams that joined us for the summit used different approaches and methodologies in vastly different contexts. Some governments—such as those of Estonia and Bangladesh—were building on a decade or more of experience refining already-advanced practices; others—such as the state of Colorado’s—were still getting ready to formally launch. Some had deep connections across their entire executive branch; others were tightly focused within a single agency.

Despite these differences, many key themes emerged throughout the convening. This paper contains reflections from the Summit. 

Read the full report

Dissecting the US Treasury Department’s Round 1 Allocations of CARES Act COVID‐19 Relief Funding for Tribal Governments

Abstract:

Randall K.Q. Akee, Eric C. Henson, Miriam R. Jorgensen, and Joseph P. Kalt; May 2020 

This study dissects the US Department of the Treasury’s formula for distributing first-round CARES Act funds to Indian Country. The Department has indicated that its formula is intended to allocate relief funds based on tribes’ populations, but the research team behind this report finds that Treasury has employed a population data series that produces arbitrary and capricious “over-” and “under-representations” of tribes’ enrolled citizens.

Read the full report

Last updated on 05/18/2020

Crisis Communications for COVID-19

Citation:

Leonard, Herman B. "Dutch", Arnold M. Howitt, and David Giles. 2020. “Crisis Communications for COVID-19”.

Abstract:

Herman "Dutch" Leonard, Arnold Howitt, and David Giles; April 2020

Communication with employees, customers, investors, constituents, and other stakeholders can contribute decisively to the successful navigation of a crisis.  But how should leaders think about what they are trying to say – and how to say it?

This policy brief lays out simple frameworks that can be used to formulate the messages that leaders can and should – indeed, must – convey to help their communities and organizations make their way forward as effectively as they reasonably can.

Read the full report

Last updated on 04/28/2020

Crisis Management for Leaders Coping with COVID-19

Citation:

Leonard, Herman B. "Dutch", Arnold M. Howitt, and David W. Giles. 2020. “Crisis Management for Leaders Coping with COVID-19”.

Abstract:

Herman "Dutch" Leonard, Arnold Howitt, and David Giles; April 2020

In the face of the rapidly evolving coronavirus crisis that demands many urgent decisions but provides few clear-cut cues and requires tradeoffs among many critically important values, how can leaders and their advisers make effective decisions about literally life-and-death matters?  This policy brief contrasts the current “crisis” environment with the more familiar realm of “routine emergencies.” It argues that for crises, leaders need to adopt a more agile, highly adaptive, yet deliberate decision-making method that can move expeditiously to action, while retaining the capacity to iteratively re-examine tactics in light of decision impacts. This method can help the team take account of the multiple dimensions of the COVID-19 crisis and cope as well as possible with swiftly changing conditions.

Read the full report

Last updated on 10/19/2020

Prioritizing Public Value in the Changing Mobility Landscape

Citation:

Goldsmith, Stephen, and Betsy Gardner. 2020. “Prioritizing Public Value in the Changing Mobility Landscape”.

Abstract:

Stephen Goldsmith and Betsy Gardner, January 2020

In this paper we will look at the values and goals cities affect with policies concerning connected mobility, and how to create a new framework that aligns with these objectives. First, we identify the transformative changes affecting cities and mobility. Second, we discuss in more detail the guiding values and goals that cities have around mobility with examples of these values in practice. Our next paper, Effectively Managing Connected Mobility Marketplaces, discusses the different regulatory approaches that cities can leverage to achieve these goals.

We recommend that cities identify various public values, such as Equity or Sustainability, and use these to shape their transit policy. Rather than segmenting the rapidly changing mobility space, cities should take advantage of the interconnectivity of issues like curb space management, air quality, and e-commerce delivery to guide public policy. Cities must establish a new system to meet the challenges and opportunities of this new landscape, one that is centered around common values, prioritizes resident needs, and is informed by community engagement.

In conclusion, cities must use specific public values lenses when planning and evaluating all the different facets of mobility. Transportation has entered a new phase, and we believe that cities should move forward with values- and community-driven policies that frame changing mobility as an opportunity to amend and improve previous transportation policies.

This paper is the first in the Mobility in the Connected City series.

Read the second paper "Effectively Managing Connected Mobility Marketplaces" 

Read the full report

Last updated on 02/05/2020

Effectively Managing Connected Mobility Marketplaces

Citation:

Goldsmith, Stephen, and Mathew Leger. 2020. “Effectively Managing Connected Mobility Marketplaces”.

Abstract:

Stephen Goldsmith and Matt Leger, February 2020

As new innovations in mobility have entered the marketplace, local government leaders have struggled to adapt their regulatory framework to adequately address new challenges or the needs of the consumers of these new services. The good news is that the technology driving this rapid change also provides the means for regulating it: real-time data. It is the responsibility of cities to establish rules and incentives that ensure proper behavior on the part of mobility providers while steering service delivery towards creating better public outcomes. Cities must use the levers at their disposal to ensure an equitable mobility marketplace and utilize real-time data sharing to enforce compliance. These include investing in and leveraging physical and digital infrastructure, regulating and licensing business conducted in public space, establishing and enforcing rules around public safety, rethinking zoning and land use planning to be transit-oriented, and regulating the digital realm to protect data integrity.

This paper is the second in the Mobility in the Connected City series. 

Read the first paper  "Prioritizing Public Value in the Changing Mobility Landscape"

Read the full report

Last updated on 02/05/2020

Playbook: Government as Platform

Abstract:

Richard Pope, November 2019

Looking around the world, we can see a different approach to digital government. One of cross-government platforms that are beginning to break down organizational silos, save money and change the types of services that can be delivered to the public. This playbook is written for practitioners, from public sector product managers to chief digital officers, looking for approaches to implementing platforms in government. 

Read full paper

Last updated on 01/24/2020

An Analysis of the Council of Arab Economic Unity’s Arab Digital Economy Strategy

Abstract:

Edited by David Eaves, October 2019

In this report, experts analyze the Council of Arab Economic Unity's comprehensive digital strategy for the Arab region. While some countries have individually launched digital economy roadmaps in recent years, the Arab Digital Economy Strategy offers a new opportunity to consider the benefits and challenges of digital cooperation across countries. Specifically, this report details areas of concern and explores some potential resolutions to these challenges.

Read full paper

Last updated on 01/24/2020

A Fair and Feasible Formula for the Allocation of CARES Act COVID‐19 Relief Funds to American Indian and Alaska Native Tribal Governments

Abstract:

Randall K.Q. Akee, Eric C. Henson, Miriam R. Jorgensen, and Joseph P. Kalt; May 2020 

Title V of the CARES Act requires that the Act’s funds earmarked for tribal governments be released immediately and that they be used for actions taken to respond to the COVID‐19 pandemic. These may include costs incurred by tribal governments to respond directly to the crisis, such as medical or public health expenditures by tribal health departments. Eligible costs may also include burdens associated with what the U.S. Treasury Department calls “second‐order effects,” such as having to provide economic support to those suffering from employment or business interruptions due to pandemic‐driven business closures. Determining eligible costs is problematic.

Title V of the CARES Act instructs that the costs to be covered are those incurred between March 1, 2020 and December 30, 2020. Not only does this create the need for some means of approximating expenditures that are not yet incurred or known, but the Act’s emphasis on the rapid release of funds to tribes also makes it imperative that a fair and feasible formula be devised to allocate the funds across 574 tribes without imposing undue delay and costs on either the federal government or the tribes.

Recognizing the need for reasonable estimation of the burdens of the pandemic on tribes, the authors of this report propose an allocation formula that uses data‐ready drivers of those burdens.  Specifically, they propose a three‐part formula that puts 60% weight on each tribe’s population of enrolled citizens, 20% weight on each tribe’s total of tribal government and tribal enterprise employees, and 20% weight on each tribe’s background rate of coronavirus infections (as predicted by available, peer‐reviewed incidence models for Indian Country).

Read the full report

Last updated on 06/01/2020

Replicating Urban Analytics Use Cases

Abstract:

Craig Campbell, January 2019

At a 2016 meeting of leading municipal analytics practitioners and experts at the Harvard Kennedy School, Johns Hopkins GovEx’s then-director of advanced analytics, Carter Hewgley, assessed the opportunities for analytics replication: “The good news is that problems and opportunities in U.S. cities are similar, meaning there is unending replication potential,” he said. The bad news was that lack of good protocols for use case discovery, challenges accessing and standardizing data, and uneven investment in data-literate human capital make analytics use cases difficult to generalize and import into different cities. At a time when the value of predictive analytics is widely recognized as a tool for better decision making and “chief data officer” is an increasingly common title in municipal government, cities still face the same challenges adopting analytical models into routine operations they have faced for decades.

Read full paper

Reforming Mobility Management: Rethinking the Regulatory Framework

Abstract:

Stephen Goldsmith, January 2019

More people than ever live in cities, where the dominant mode of transportation continues to be single-occupant personal vehicles. This has created unprecedented burdens on city infrastructure and increased congestion on roads in urban centers. Increased congestion has resulted in greater greenhouse gas emissions, lower reliability of public transit systems, longer commutes, and an overall lower quality of living for citizens.

These challenges have created fertile ground for private-sector innovation within the mobility ecosystem. Thus far, the most significant private sector innovation in urban mobility has been ridesharing. Conventional wisdom attributes the birth of rideshare to the proliferation of smartphones and improvements in wireless connectivity and location data in cities. However, the ridesharing industry also relies on dependability and reliability of free public roads, which were a critical component in the development of the modern car-friendly city. Unfortunately, these same public roads lack the infrastructure to coordinate and interact with digital-first services as effectively as they coordinate the physical movement of people and goods.
 

Download paper

Last updated on 08/06/2020

Cooperative Procurement: Today’s Contracting Tool, Tomorrow’s Contracting Strategy

Abstract:

Scott Becker and Stephen Goldsmith, October 2018 

Increasingly, governments across the country are turning to cooperative procurement for greater value. Joining with other entities can significantly reduce administrative costs and leverage the benefits of economies of scale. In recent years, cooperatives have evolved to provide a wider variety of benefits to procurement officials and vendors, offering increasingly complex services adaptable to a growing participant pool. Expansion of offerings and targeted attention to best-in-class contracts have furthered their value proposition. This paper intends to provide insight into today’s cooperative procurement market, evaluate value propositions and challenges, and present strategies for success. 

Read full paper

Analytics in City Government

Citation:

Gover, Jessica A. 2018. “Analytics in City Government”.

Abstract:

Jessica A. Gover, July 2018 

How the Civic Analytics Network Cities Are Using Data to Support Public Safety, Housing, Public Health, and Transportation 

 

From remediating blight to optimizing restaurant inspections and pest control, cities across the country are using analytics to help improve municipal policy and performance. The continued adoption of analytics in city governments shows no sign of slowing, and as even more sophisticated tools such as machine learning and artificial intelligence are deployed, there is a critical need for research on how these practices are reshaping urban policy. By examining and capturing lessons learned from city-level analytics projects, practitioners and theorists alike can better understand how data- and tech-enabled innovations are affecting municipal governance. This report seeks to contribute to that developing field.

 

Read full paper

Last updated on 01/29/2020

Technology and Governance in Singapore’s Smart Nation Initiative

Abstract:

Jun Jie Woo, May 2018 

Decades of rapid economic growth and urbanization in Singapore have given rise to new and increasingly complex policy problems. Singapore’s policymakers have sought to address these problems by leveraging emerging technological solutions such as data analytics. This has culminated in the “Smart Nation” initiative, a nationwide and whole-of-government effort to digitize Singapore’s policy processes and urban environment. More importantly, the initiative has given rise to administrative reorganization and increased state-citizen engagement. These changes portend more fundamental shifts in Singapore’s governing milieu.

Read full paper

Like a Fish in Water: An Essay on the Benefits of Government That Nobody Notices

Abstract:

Steve Kelman, August 2017

In this paper Kelman discusses the role and importance of government in our society today. Debates over the role of government — often phrased as “big government” vs. “limited government” — are at the center of our political life. We debate the government’s role in health care and in regulating the environment. We debate levels of taxation. Yet there are crucial benefits of government that should be appreciated whether you are a person who thinks of yourself as liking government or not. These benefits come about because government has created an environment where we can in our everyday lives normally take the reliability and trustworthiness of others for granted. We flag down a taxi on the street, get into the driver’s car, and don’t worry this stranger might kidnap us. We walk on a sidewalk, and do not worry it will buckle beneath us. We drive a car, and do not worry the brakes won’t work. These are the unnoticed benefits of government, which we notice no more than a fish notices it is swimming in water.

Read full paper

Last updated on 03/01/2020

Our Path to “New Normal” in Employment? Sobering Clues from China and Recovery Scores for U.S. Industry

Citation:

Cunningham, Edward, and Phillip Jordan. 2020. “Our Path to “New Normal” in Employment? Sobering Clues from China and Recovery Scores for U.S. Industry.” Ash Center for Democratic Governance and Innovation.

Abstract:

Edward Cunningham and Philip Jordan, July 2020 

The US National jobs reports for May and June exceeded expectations, and for many, this signaled that April was the true peak of American job losses and real recovery may be underway. Yet mounting evidence suggests that a job recovery is a long way off and that many jobs may not return.

Part of the analytic disconnect stems from the fact that the global pandemic is a novel challenge for policymakers and analysts. We lack current, useful benchmarks for estimating the damage to the labor market, for estimating what recovery would look like, and for measuring an eventual recovery in jobs. Given this paucity of models, one place to look for patterns of potential recovery – particularly relating to consumption and mobility – is China.

The Chinese economy is driven largely by consumption, urban job creation is driven by small and medium-sized companies, and China is several months ahead of the US in dealing with the pandemic’s economic and labor impact. An analysis of China’s experience may, therefore, offer important clues about our recovery here at home, and inform new models of thinking about American job recovery.

Read the full report

Last updated on 07/15/2020

Understanding CCP Resilience: Surveying Chinese Public Opinion Through Time

Citation:

Cunningham, Edward, Tony Saich, and Jessie Turiel. 2020. Understanding CCP Resilience: Surveying Chinese Public Opinion Through Time. Ash Center for Democratic Governance and Innovation.

Abstract:

Edward Cunningham, Tony Saich, and Jessie Turiel, July 2020

This policy brief reviews the findings of the longest-running independent effort to track Chinese citizen satisfaction with government performance. China today is the world’s second largest economy and the Chinese Communist Party (CCP) has ruled for some seventy years. Yet long-term, publicly-available, and nationally-representative surveys in mainland China are so rare that it is difficult to know how ordinary Chinese citizens feel about their government.

We find that first, since the start of the survey in 2003, Chinese citizen satisfaction with government has increased virtually across the board. From the impact of broad national policies to the conduct of local town officials, Chinese citizens rate the government as more capable and effective than ever before. Interestingly, more marginalized groups in poorer, inland regions are actually comparatively more likely to report increases in satisfaction. Second, the attitudes of Chinese citizens appear to respond (both positively and negatively) to real changes in their material well-being, which suggests that support could be undermined by the twin challenges of declining economic growth and a deteriorating natural environment.

While the CCP is seemingly under no imminent threat of popular upheaval, it cannot take the support of its people for granted. Although state censorship and propaganda are widespread, our survey reveals that citizen perceptions of governmental performance respond most to real, measurable changes in individuals’ material well-being. For government leaders, this is a double-edged sword, as citizens who have grown accustomed to increases in living standards will expect such improvements to continue, and citizens who praise government officials for effective policies may indeed blame them when such policy failures affect them or their family members directly. While our survey reinforces narratives of CCP resilience, our data also point to specific areas in which citizen satisfaction could decline in today’s era of slowing economic growth and continued environmental degradation.

Read the full report

Last updated on 07/08/2020

China's Most Generous: Examining Trends in Contemporary Chinese Philanthropy


Abstract:

Edward Cunningham and Yunxin Li, March 2020 

This annual report highlights leading results from the most recent data analysis of the Harvard Kennedy School Ash Center’s China Philanthropy Project, capturing over one-quarter of estimated national giving in China. We focus on elite giving by building an annual database of the top 100 individual donors, top 100 donors from corporations and other organizations, and also top university recipients of philanthropic giving.

In 2018, such Chinese giving:

  • was dominated by large organizations (most commonly corporations) rather than individuals,  
  • supported in large part central government policy priorities in the area of poverty alleviation,
  • revealed an intriguing new philanthropy-driven educational model in the country, and
  • remained fairly local in scope.

Read the report in Chinese 

Read the full English report

Last updated on 05/04/2020

China’s Role in Promoting Transboundary Resource Management in the Greater Mekong Basin (GMB)

Abstract:

Malcolm McPherson, March 2020 

This paper examines how China can improve transboundary resource management within the Greater Mekong Basin (GMB) through its participation in the Lancang-Mekong Cooperation (LMC). Such improvement would ensure the efficient management and equitable development of the basin’s natural resources and ecosystems.

Read the full report

Hong Kong: The Rise and Fall of “One Country, Two Systems”

Abstract:

William H. Overholt, December 2019

This is an extensively edited, updated and expanded text of a lecture given for the Mossavar-Rahmani Center for Business and Government at Harvard Kennedy School on October 31, 2019. From the origination of “one country, two systems” in 1979 to today, this paper analyzes the history of the unique relationship between Hong Kong, Beijing, and the world.

Read full paper

Last updated on 03/31/2020

Vietnam's Crisis of Success in Electricity: Options for Successful Clean Energy Mix

Abstract:

David Dapice, December 2018

As Vietnam looks to the future planning for energy production and use, careful analysis of the cost of the variety of power generation systems that can contribute to a successful energy mix is key. This paper is designed to assist in that effort and in providing information for the research for Power Development Plan 8.

Read full paper

Local Governance and Access to Urban Services in Asia

Abstract:

Shabbir Cheema, November 2018

This policy brief explores how democratic processes in local governance affect access to urban services in Asian cities, especially for marginalized groups. It is based on research conducted by a group of national research and training institutions in nine cities in five Asian countries as well as regional dialogue hosted and facilitated by East-West Center with the support of the Swedish International Center for Local Democracy (ICLD). Governance process variables investigated were local government resources and capacity; mechanisms for local participation, accountability, and coordination; use of information and communications technology (ICT); implementation and replication of good practices; and management of peri-urbanization. This brief outlines research findings that are applicable across countries at the city level.

Read full paper

Technology and Governance in Singapore’s Smart Nation Initiative

Abstract:

Jun Jie Woo, May 2018 

Decades of rapid economic growth and urbanization in Singapore have given rise to new and increasingly complex policy problems. Singapore’s policymakers have sought to address these problems by leveraging emerging technological solutions such as data analytics. This has culminated in the “Smart Nation” initiative, a nationwide and whole-of-government effort to digitize Singapore’s policy processes and urban environment. More importantly, the initiative has given rise to administrative reorganization and increased state-citizen engagement. These changes portend more fundamental shifts in Singapore’s governing milieu.

Read full paper

Counting all of the Costs: Choosing the Right Mix of Electricity Sources in Vietnam to 2025

Abstract:

David Dapice, November 2017

How rapidly will or could demand for power grow in Vietnam? What will interest rates be? Will the cost of generating plants go up or down, and by how much? What will the cost of each fuel be? Will the cost of carbon or other pollution begin to enter into investment decisions?

This paper will examine these questions. It will begin by looking at demand projections and investments in efficiency – getting more output per kilowatt hour used. It will then try to estimate the costs of building and running various types of generating plants in Vietnam over time. It will also use various costs of carbon to see if including these both as a source of global warming and as an indicator of local pollution changes the calculation. Changes in the domestic supply of gas will also influence the set of potential solutions, as will the declining costs of solar electricity and battery storage. In all of this it is the system or mix of investments that need to work, not any single investment.

Read full paper

Last updated on 03/01/2020

Can China Reduce Entrenched Poverty in Remote Ethnic Minority Regions?

Abstract:

Arthur N. Holcombe, June 2017

In this paper Holcombe discusses lessons from successful poverty alleviation in Tibetan areas of China during 1998–2016. In the period between 1978 and 2015, the World Bank estimates that over 700 million people have been raised out of poverty based on a poverty line of $1.50 per capita. It also estimates that about 48 percent of residual poverty in China is located in ethnic minority areas where top-down macroeconomic policies to reduce poverty have been least effective and where strategies to target poor ethnic minority households with additional financial, technical, and other support were not successful in overcoming cultural and other barriers to greater income and food security.

Read full paper

Last updated on 03/01/2020

Values and Vision: Perspectives on Philanthropy in 21st Century China

Abstract:

Anthony Saich & Paula D. Johnson, May 2017

Values and Vision: Perspectives on Philanthropy in 21st Century China is an exploratory study of philanthropic giving among China’s very wealthy citizens. Recognizing the increasing number of successful entrepreneurs engaged in philanthropic activity in China, the study explores the economic and policy contexts in which this philanthropy is evolving; the philanthropic motivations, aspirations and priorities of some of the country’s most engaged philanthropists; and the challenges and opportunities for increasing philanthropic engagement and impact in China.

Chinese (traditional) translation available here

Chinese (simplified) translation available here 

Read full paper

Last updated on 03/01/2020

Rakhine State: Dangers and Opportunities

Abstract:

David Dapice, May 2017 

The paper provides an updated assessment of the danger that the Rakhine state conflict poses to all of Myanmar in terms of cost in lives, international reputation, depressed FDI, ongoing violence and sectarian conflict. The author makes the case that settling the issue will require a strategy that extends beyond restoring security, one that offers a real possibility of success at a political and economic level. He offers that the path forward lies in enabling moderate local and central leaders to bring about a new idea of citizenship, enhancing local socio-economic prospects by investing in roads, power and irrigation, as well as by restricting illegal foreign fishing off the coast of Rakhine, and by extending health and education services throughout the province to all residents.

Read full paper

Last updated on 03/01/2020

Rakhine State: In Need of Fundamental Solutions

Abstract:

David Dapice, February 2017, revised April 2017

In this paper, David Dapice considers the factors that are at the heart of the instability in Rakhine state and suggests options for approaching citizenship and mobility issues and for overcoming the constraints on implementing development in the state.

Read full paper

Last updated on 03/01/2020

Health Education in China's Factories: A Case of Embedded Education

Citation:

Zhang, Siwen, Hua Chen, Songyu Zhu, Jorrit de Jong, and Guy Stuart. 2017. “Health Education in China's Factories: A Case of Embedded Education”.

Abstract:

Siwen Zhang, Hua Chen, Songyu Zhu, Jorrit de Jong, and Guy Stuart, January 2017 

This case study focuses on HERhealth, the health education program within the HERproject, as it was implemented in China from 2007 onwards. Based on reports supplied by BSR, this case study documents the health education program and its effects on the behavior of the women who received the education, in terms of improved reproductive health, personal hygiene, and safe sex practices.

Read full paper

Last updated on 01/24/2020

HIV/AIDS Prevention on Southern China's Road Projects: A Case of Embedded Education

Citation:

Zhang, Siwen, Hua Chen, Songyu Zhu, Jorrit de Jong, and Guy Stuart. 2017. “HIV/AIDS Prevention on Southern China's Road Projects: A Case of Embedded Education”.

Abstract:

Siwen Zhang, Hua Chen, Songyu Zhu, Jorrit de Jong, and Guy Stuart, January 2017  

This is a case study of the Asian Development Bank (ADB)-sponsored HIV/AIDS prevention program implemented at expressway construction sites in Guangxi province from 2008 to 2015. The program delivered HIV/AIDS prevention education to migrant workers at the sites, as well as to members of the communities near the sites.

Read full paper

Internal and External Challenges to Unity in Myanmar

Abstract:

David Dapice, December 2016 

A year after the election that gave a historic victory to the National League for Democracy, Myanmar faces a critical juncture. Ethnic war and religious strife stubbornly persist; democratic gains remain fragile; and major challenges, from mineral and hydroelectric revenues to land insecurity to illicit drug production and use, have yet to be tackled meaningfully. In foreign policy, a resurgent China has indicated that it intends to play an active role in settling conflicts along its border and perhaps further afield. Meanwhile, an expectant public looks for signs of progress from a new government that is still finding its way. This paper argues that the internal and external challenges faced by Myanmar are linked, and suggests that economic progress, unity, and effective independence will remain elusive (or could decline) unless the leadership explores pragmatic solutions to ethnic and religious grievances and produces economic growth that is high, sustainable, and widely shared. Click here to read the Burmese version.

Read full paper

Last updated on 01/24/2020

A Grand Bargain: What It Is and Why It Is Needed

Abstract:

David Dapice, October 2016 

Achieving sustainable peace in Myanmar requires a comprehensive effort involving real negotiations between the army, the government, and the remaining non-signatory armed ethnic groups. This paper makes the case that the resulting Grand Bargain will have to involve some degree of limited autonomy for states and natural resource revenue sharing. It will provide a basis for compensating armed groups on both sides for lost revenue, bringing revenues to Kachin State for development, and facilitating the NLD government’s investment in development spending for the rest of Myanmar in line with its priorities. Given the sizeable estimated total value of jade sales and the fact that, in the author’s estimate, official government collections for jade now amount to only 3% of sales, the paper examines possible modalities for taxing jade at a reasonable level and sharing these revenues in ways that make durable progress possible. Several other key challenges to national unity are briefly addressed as well. Click here to read the Burmese version.
 

Read full paper

Growing Apart? Challenges to High-Quality Local Governance and Public Service Provision on China’s Ethnic Periphery

Abstract:

Sara Newland, July 2016 

Often assumed to be an ethnically homogeneous country, the People’s Republic of China (PRC) in fact has a substantial minority population, with 55 officially recognized ethnic minority groups that comprise close to 10 percent of the population. Integrating these diverse groups, many of which have a centuries-long history of conflict with the Han Chinese, into a unified Chinese nation-state has been a core policy challenge for the Chinese Communist Party (CCP) since 1949. At first, these challenges were largely political and ideological. The CCP struggled to integrate minority elites, many of whom did not share a common language or culture with the overwhelmingly Han leaders of the CCP, into the party. It also sought to create political institutions that both respected local cultural practices and combined these diverse regions under a single, unified state, a challenge that the Soviet Union also had to confront.

 

Read full paper

Assembling China’s Carbon Markets: The Carbons, the Business, and the Marginalized

Abstract:

John Chung-En Liu, June 2016 

China is in the process of establishing a national cap-and-trade program to limit greenhouse gas emissions. Beyond the top-tier market design (cap-setting, auction rules, etc.), Chinese policymakers need to pay attention to how the new carbon market is embedded in larger social contexts. This brief highlights that, to realize the full potential of the carbon market, the Chinese government needs to engage seriously with three actors that receive less attention: the carbons, the business, and the marginalized.

Read full paper

The Fed’s Tapering Talk: A Short Statement’s Long Impact on Indonesia

Abstract:

Muhamad Chatib Basri, June 2016

In this paper, Dr. Muhamad Chatib Basri, who was Indonesia’s Minister of Finance during the Taper Tantrum (TT) period, analyzes the responses to the TT of the five hardest-hit countries, dubbed the “Fragile Five” (Brazil, India, Indonesia, South Africa, and Turkey), and describes how Indonesia was able to mitigate the negative effects of the TT so quickly and effectively. Dr. Basri’s account provides many insights into macroeconomic management amid external shocks that should be useful to emerging markets as the Fed contemplates raising interest rates, which could have the same impact as the TT. Dr. Basri wrote this paper while a Senior Fellow at the Ash Center for Democratic Governance and Innovation and is now in the Department of Economics at the University of Indonesia.

Read full paper
