The recent global business mania around adoption of generative artificial intelligence (GenAI) has amplified interest in organizational data literacy efforts in industry. Data literacy, when scoped clearly and comprehensively, represents a spectrum inclusive of business, data, and analytical acumen (inclusive of AI), as well as degrees of capability (literacy, fluency, and mastery). In the face of well-known change management challenges, we believe that the current discussion focused on the executive suite is not sufficient and must extend to the primary execution arm, which in most organizations is the middle management layer.
Keywords: data literacy, change management, people management, data-driven decision-making
The recent global business mania around adoption of generative artificial intelligence (GenAI) has amplified interest in organizational data literacy efforts in industry, where data literacy is defined as “the ability to access, critically assess, interpret, manipulate, manage, summarize, handle, present, and ethically use data” (Okamoto, 2017, p. 120), or more recently, “the ability to read, write, and communicate with data in context” (Logan & Duncan, 2018, p. 3).
We authors note that data literacy, when scoped clearly and comprehensively, represents a spectrum inclusive of business, data, and analytical acumen (including AI), as well as degrees of capability (literacy, fluency, and mastery)—while also noting that the same tactics, questions, and concerns we discuss below will manifest when working toward increasing organizational capacity in any of these contexts. We will use ‘data literacy’ here as a term primarily representing foundational language and skill sets, but our discussion is not exclusive of the more advanced skills.
Data literacy is more important than ever today as GenAI has accelerated the data revolution alongside the massive increase in produced data (the 5Vs of big data: velocity, volume, value, variety, and veracity, as referenced by Ishwarappa & Anuradha, 2015, pp. 320–321, among many others) and computing advances. Indeed, organizations have a real opportunity to democratize the use of data in decision-making and productive collaboration among employees at all levels—but only if those employees are equipped to do so. Data-driven decision-making and productive collaboration, whether using more conventional analytical approaches (descriptive and inferential statistics, predictive modeling, optimization, traditional machine learning, etc.) or using GenAI, require an additional degree of foundational data acumen (across diverse data sources and types) and a heightened understanding of critical thinking, ethics, and bias. And, as always, an organization’s ability to extract value from these opportunities depends entirely on people—and specifically on their business acumen as well as their comfort and confidence with the enhanced access and capabilities.
Organizational data literacy is first and foremost a change management exercise, similar in its fundamentals to well-documented historical efforts to drive customer obsession, adopt lean practices, or encourage organizational innovation. In each of those examples, change leaders benefited from charter documents or ‘manifestos’ that codified best practice, as well as from representative peer organizations reviewed in both the startup (Blank, 2013; Blank & Dorf, 2020; Osterwalder, 2013, etc.) and established business literature (Carlson & Wilmot, 2006; Zunino, Suarez, & Grodal, 2019). Today’s “Analytical Leaders” (to borrow a term from Davenport & Harris, 2017) combine high degrees of organizational momentum and structure around the technical disciplines of analytics with, importantly, the organizational champions needed to drive that momentum throughout the organization.
Leaders, however, do not operationalize change on their own. Specifically, middle managers are a critical link between a vision of change and the implementation and operationalization of that vision—especially important because democratized, data-driven decision-making requires fundamental behavioral change among a variety of stakeholders, and such practices represent departures from existing behavioral norms, which tend to overvalue intuition and judgment based on experience rather than on ‘what the data says.’ We authors believe that current discussion focused on the executive suite is not sufficient and must extend to the primary execution arm, which is middle management.
Middle managers are the link between the vision set by executives and the day-to-day operations and culture of an organization. They translate leadership’s vision and strategic imperatives to their teams and can make or break an initiative through how well they communicate its value. Middle managers can take abstract ideas like ‘data culture’ or ‘becoming more data fluent’ and make them tangible on both technical and nontechnical teams, leading to employee engagement in the initiative. Middle managers also play a critical role in influencing upper management. Whereas executives lead the charge on the need for data literacy and fluency, middle managers report from the trenches on where the challenges and opportunities lie, enabling leaders to make informed strategic decisions. In organizations where leaders are not yet focused on the value of data literacy and fluency, middle managers can build these capabilities within their own teams and provide a proof of concept for leaders to expand the initiative more broadly. Managers provide essential feedback loops for both upper management and individual contributors, ensuring that the vision for a data-forward organization cascades through all levels and across functions.
A core tenet of successful change management is clear communication, including both the organization's rationale for encouraging change and a common understanding of external and internal nomenclature and terminology (Phillips & Klein, 2023, among many others). It is especially important to carefully consider change management and communication strategies when undertaking data literacy initiatives because there is widespread anxiety about job security in the age of AI. Workers may associate data literacy efforts with AI, even if AI literacy is not explicitly covered. In their 2024 Work in America Survey, the American Psychological Association found that 41% of workers feared that AI would make some or all of their job duties obsolete, with even higher rates among younger workers (50% for ages 18–25 vs. 31% for ages 58+) (American Psychological Association, 2024).
Individual contributors were less likely than middle managers to believe that their employers would retrain them if their jobs were replaced by AI (53% vs. 69%, respectively). With this statistic in mind, managers have an opportunity to present data literacy initiatives as company-sponsored personal development that will give employees an edge in the AI economy.
It is important to keep the change management strategies discussed above in mind when assessing employees’ data literacy. Regardless of the assessment method, it is critical to understand the current state of knowledge and skills within an organization before undertaking a data literacy initiative. Learning must be targeted to fill the knowledge and skill gaps specific to the organization in order to maximize impact.
The majority of the literature on assessing data literacy is targeted toward students and educators. Cui et al. (2023) reviewed 25 articles on data literacy assessment, and only two were targeted toward working professionals; the other 23 were targeted toward students, educators, researchers, and data librarians. The two articles aimed at working professionals focused on decision-makers working in Tanzanian government institutions (Aung et al., 2019) and on Estonian journalists (Kouts-Klemm, 2019). In both articles, participants were recruited to complete the data literacy assessments. Since the assessments were not mandatory and did not directly precede a planned data literacy initiative, the authors did not have to consider how the assessments would affect change management.
Data literacy assessments for individuals typically fall into two categories: self-assessment and objective assessment (Kim et al., 2023). Self-assessments rely on personal reporting of levels of comfort and skill in various areas of data literacy (e.g., ‘Select the statement that best describes your comfort with data visualization’). Objective assessments, on the other hand, ask questions that confirm the test-taker’s knowledge of a subject (e.g., ‘Which of the following visualizations most effectively communicates change in revenue over time?’). Aung et al. (2019) and Kouts-Klemm (2019) used a combination of self-assessments and objective assessments in their interviews with government decision-makers and journalists.
While researchers often develop their own assessment tools, numerous private or nonprofit organizations offer data literacy assessment tools. Most of the organizations offering these tools also offer data literacy education or consulting (Bonikowska et al., 2019). The following institutions offer assessment tools:
myDatabilities (myDatabilities, n.d.) offers a free, 10-minute self-assessment based on the competency mapping produced by Ridsdale et al. (2015).
The Data Literacy Project (Data Literacy Project, n.d.) provides a free, 10-question self-assessment. Aryng (Aryng, n.d.) offers a 10-question assessment that mixes self-assessment and objective questions.
The News Literacy Project (News Literacy Project, n.d.) offers a free, 5-question objective assessment, asking questions about news headlines and visualizations.
More thorough, paid objective assessments are provided by Elements (Elements, n.d.) and Questionmark (Cambridge Assessment Network and Questionmark Computing, 2021).
Individual objective assessments are a strong method for determining knowledge and are standard in educational settings, but they may not always be practical to deploy widely in a business setting. One downside to objective assessments as part of data literacy initiatives is that they may increase workers’ anxiety around job security by making them feel that they are being evaluated against their peers. Another issue is that written questionnaires of either type (objective or self-assessment) can be time-consuming for employees, which could lead to low completion rates and reduced enthusiasm for a new initiative.
If traditional objective assessments are deemed too onerous, skills can be estimated through the analysis of corporate artifacts (job descriptions, career ladders, performance reviews, training records, etc.).
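As one concrete illustration of estimating skills from artifacts, the following is a minimal, hedged sketch that scans a folder of exported job descriptions and tallies how many mention each data-skill area. The skill lexicon, folder name, and matching rules are hypothetical placeholders and would need to be adapted to an organization’s own vocabulary and artifact formats.

```python
# Illustrative sketch only: a naive keyword scan of job-description text to
# approximate which data skills an organization hires for. The skill lexicon,
# folder layout, and scoring below are hypothetical, not an assessment tool
# described in this article.
from collections import Counter
from pathlib import Path
import re

# Hypothetical lexicon mapping data-literacy skill areas to indicator phrases.
SKILL_TERMS = {
    "data visualization": ["data visualization", "dashboards", "tableau", "power bi"],
    "statistics": ["statistics", "statistical analysis", "a/b testing"],
    "data management": ["sql", "data governance", "data quality"],
    "machine learning / AI": ["machine learning", "predictive model", "generative ai"],
}

def scan_job_descriptions(folder: str) -> Counter:
    """Count how many job descriptions mention each skill area at least once."""
    counts = Counter()
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="ignore").lower()
        for skill, terms in SKILL_TERMS.items():
            if any(re.search(re.escape(term), text) for term in terms):
                counts[skill] += 1
    return counts

if __name__ == "__main__":
    # Assumes a (hypothetical) folder of exported job descriptions,
    # one plain-text file per posting.
    for skill, n in scan_job_descriptions("job_descriptions").most_common():
        print(f"{skill}: mentioned in {n} job descriptions")
```

A tally like this is, at best, a rough signal of what the organization says it hires for; as the next paragraph notes, artifacts alone can misrepresent what people actually know and do.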
The downside of relying on artifacts alone is that some artifacts themselves rely on self-reporting: what an individual claims verbally or on their resume is not always an accurate reflection of their knowledge and skills. People are also frequently hired or promoted without meeting all the requirements in a job description or level of a career ladder. Another advantage of driving data literacy efforts via middle managers is that they can often confirm and expand on skills assessments derived from artifacts. Technical managers in particular are usually well attuned to the technical skills of their team and may be able to report on their team members’ data literacy skills.
Each type of assessment has pros and cons that should be considered when deciding how to evaluate the skills and knowledge of individuals within an organization. Middle managers are well positioned to determine which assessment methods will be the most effective for their teams.
Once a manager understands the current data literacy levels of their team members, they need to determine what a sufficiently data-fluent team looks like. For example, while all employees may need a foundational understanding of how GenAI can be used to increase productivity, the marketing team may require more technical familiarity, including proficiency in data analytics and visualization, in order to produce content. The manager can incorporate data education into employee development goals as part of the performance review process. Creating specific learning and development goals that align with team objectives helps everyone understand why the learning is meaningful and a good use of time.
Demonstrating the efficacy of a data literacy program requires continuously tracking progress from a ‘current state’ to a ‘future state.’ Each level of the organization may be measured differently on improvement in data literacy:
Executives can be measured on how data literacy has improved and enabled their business strategies.
Middle managers can be measured on their success at improving processes, retaining critical resources, and driving innovative execution.
Data practitioners can be measured on the adoption rates of their data products, the outcomes those products deliver, and increased output against each business strategy.
More mature programs can measure return on investment of the program with clear financial impact to the organization.
With goals set, it is time to begin skill-building. Online resources, particularly courses, are an inexpensive place to start:
An organization may already have an employee learning platform such as Udemy or LinkedIn Learning that offers courses on data topics.
An excellent resource that offers both limited free individual accounts and paid licenses for businesses is DataCamp. Managers can assign coursework and lead discussions to help bridge skill gaps between team members.
Online courses are an inexpensive way for employees to acquire new skills, but more comprehensive approaches ensure that key learner personas are mapped to experiential learning and that relevant programs are designed for each persona, including middle managers. Examples of in-depth approaches to data literacy programs include:
Hiring a data literacy program leader to partner with internal learning and development experts with specializations in data literacy.
Providers like The Data Lodge offer programs with train-the-trainer bootcamps and peer communities of data literacy leads who are implementing their own change programs.
A variety of strong providers, such as Data Society, Correlation One, and DataLiteracy.com, offer both off-the-shelf and customized live workshops on a wide range of data topics.
Some academic institutions offer nondegree training and customized learning programs such as those offered by Northeastern University’s Roux Institute in Portland, Maine.
Participants in custom learning programs should complete a data project that is tied to their specific role and objectives (Kondratjew & Kahrens, 2019). Managers should be involved in this process, ensuring that the project topic is relevant and will realize value for the team and a return on investment for the company. Whether the learning experiences are deployed on the scale of a small team or an entire company, it is critical that the content taught is tied to the needs of the organization and regularly connected back to use cases specific to the company (Fergusson, 2022). For example, a team member who has just completed coursework on data visualization and storytelling might be assigned a presentation to stakeholders. Coursework establishes a foundation of knowledge—but managers need to help their team apply newly acquired skills to their work on an ongoing basis. Without consistent application, training will not have a long-term impact.
Another way for managers to help their employees improve their data literacy is through work assignments. For example, if the team needs a new dashboard, a manager can assign a team member who needs to learn this skill to work alongside someone experienced.
As is true for any new initiative, a culture where learning and experimentation are valued is necessary for mindset and behavioral change to manifest in team members’ regular work. Managers can celebrate creativity and risk-taking, clearly communicating that it is okay to fail as they experiment with new approaches and skills.
Once a manager has supported data learning and applied that learning to work assignments, it is time to integrate data-forward behaviors into the standard processes and operations of the team, and therefore into the culture. Data-forward behaviors are actions that emphasize the use of data to guide business decisions, such as prioritizing data governance in business processes, structuring data for effective visualization, and using exploratory data analysis and visualization tools in decision-making conversations. For example, a data-forward team may set a standard that you do not raise a problem or recommend a solution without data in hand to support your case. Performance reviews can require data to support the employee’s work performance for that quarter, and regular peer review work sessions can help the team share knowledge and new data-driven methods. Managers should look at the cadences of meetings and communication and find opportunities to integrate data-forward approaches and data-informed decision-making. Like any cultural effort, a strong data culture is only maintained through practice.
As current and former business executives themselves, we authors universally agree that change management in business contexts is most effectively articulated and evangelized through real-world case studies (also argued by Phillips & Klein, 2023, among many others). We present three examples of data literacy efforts that specifically target middle managers as part of the program.
Northeastern University’s Roux Institute (http://roux.northeastern.edu) in Portland, Maine, has a mission to jump-start the AI-enabled technology and life sciences economy of Maine and northern New England. To fulfill that mission, the institute works with partners in industry, government, and the nonprofit sector to help them transform for the AI economy. When the institute engages with a partner, it uses its own Analytical LEAP© framework (Koloski et al., 2024) to help focus investments in experiential learning for training and upskilling workforces.
Analytical LEAP proposes a simple explanation of the organizational underpinnings required to quickly adapt to the data and AI revolution by benchmarking four areas against ratings of ‘launch,’ ‘explore,’ ‘accelerate,’ and ‘perform’:
Learning Culture: What is the evidence of continuous learning throughout the organization?
Ecosystem: Is the organization’s data strategy infused throughout all levels of the organization?
Analytical Architecture: Are there strong practices and associated technology that enable data usage throughout the enterprise?
People: Do teams and individuals have the requisite knowledge and skills to accelerate organizational progress using data, analytics, and AI?
It is our contention that the first three categories (Learning Culture, Ecosystem, Analytical Architecture) are enablers of an organization’s adaptation to the data economy, but the value-unlocking opportunity is in the fourth category—People. LEAP further drills into the People dimension to focus on requisite skill sets for the major categories of data-centric roles in an enterprise,
Senior Leadership (e.g., executives)
Consumers (e.g., line-of-business stakeholders, middle managers, HR)
Curators (e.g., data engineers, IT)
Practitioners (e.g., analysts, data scientists)
Data Citizens (everyone!)
and then helps organizations assess baseline skill sets for each persona. Finally, the framework maps the skill levels of each persona to learning experiences that will help them reach the next level of maturity.
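To make the persona-based structure concrete, the following is a minimal sketch, in Python, of one way the mapping from persona to assessed baseline and learning pathway might be recorded. The maturity levels and persona names follow the LEAP framework as described above; the baseline levels and pathway contents shown are hypothetical placeholders, not part of the framework itself.

```python
# Minimal sketch of a persona-to-learning-pathway mapping. The maturity levels
# and persona names come from the Analytical LEAP framework described in the
# text; the baselines and pathway contents are hypothetical placeholders.
from dataclasses import dataclass

MATURITY_LEVELS = ["launch", "explore", "accelerate", "perform"]

@dataclass
class Persona:
    name: str
    current_level: str            # assessed baseline for this persona
    learning_pathway: list[str]   # experiences intended to reach the next level

personas = [
    Persona("Senior Leadership", "explore",
            ["data strategy workshop", "AI literacy briefing"]),
    Persona("Consumers", "launch",
            ["foundational data literacy course", "dashboard interpretation lab"]),
    Persona("Curators", "accelerate",
            ["data governance deep dive"]),
    Persona("Practitioners", "accelerate",
            ["applied machine learning project"]),
    Persona("Data Citizens", "launch",
            ["short course: reading and questioning data"]),
]

def next_level(persona: Persona) -> str:
    """Return the maturity level this persona's pathway targets."""
    i = MATURITY_LEVELS.index(persona.current_level)
    return MATURITY_LEVELS[min(i + 1, len(MATURITY_LEVELS) - 1)]

for p in personas:
    print(f"{p.name}: {p.current_level} -> {next_level(p)} via {p.learning_pathway}")
```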
While the framework itself helps to target investments, the primary sources of information to feed the framework are the artifacts of people and data management, such as job descriptions, career ladder and performance review rubrics, center of excellence and analytics team charters, data governance and use practices, reporting norms and similar, all of which are often under the direct control of middle managers. It is in this area that middle managers exert a disproportionate effect on organizational data literacy.
Northeastern faculty at the Roux Institute used the Analytical LEAP Framework to design a 2-year, organization-wide learning plan for the more than 1,100 employees of Bangor Savings Bank (BSB), a mutually held community financial institution spanning northern New England. The learning plan is part of a 2-year initiative called Accelerating Insights, during which the Roux Institute at Northeastern is an embedded partner within BSB (Klausmeyer & Heffner-Cosby, 2024). The partnership has two goals. The first goal is to equip BSB’s employees with cutting-edge training in data literacy and the responsible use of AI. The second goal is for Northeastern faculty to create AI tools to accelerate work and help streamline routine tasks, giving employees more time for high-value customer interactions that go beyond routine transactions to create a deeper, more beneficial relationship.
For the education portion of the initiative, learning pathways were designed for each role in the LEAP Framework data community (senior leaders, consumers, practitioners, curators, and data citizens), ensuring that each employee received learning relevant to their role. The faculty’s discovery process for placing the bank in the LEAP Framework included the following steps: a kickoff with group brainstorming sessions with leaders (executive and middle managers), artifact collection and evaluation, and follow-up interviews with leaders and technical team members.
The first step in the assessment process was to conduct a kickoff with both senior leaders and middle managers. The kickoff brought middle managers into the process early, which helped gain their support for the initiative and enabled them to communicate its value to their respective teams. It was critical for middle managers to feel that they were part of shaping the initiative.
After introducing the initiative, faculty facilitated small group brainstorming sessions, where they asked participants to answer questions about data literacy and current usage within the bank. A few examples of questions asked are:
How are decisions on your team usually made (i.e., supported by data or intuition)?
What types of data are you currently underutilizing that could be leveraged to gain better insights to help your customers?
Which of your department’s priorities could be supported by clearer data insights or AI tools?
The brainstorming sessions gave faculty insight into the organizational data culture and ecosystem of BSB.
After the kickoff, institute faculty collected and assessed organizational artifacts, including job descriptions and training records, to gain an initial understanding of employees’ skills in data and AI. However, as discussed in Section 3, artifacts do not always provide a complete picture of the knowledge and skills of individuals.
As a next step, faculty considered conducting individual objective assessments. Leaders at BSB were aware that individual assessments could increase concern about the use of AI and undermine support for the initiative. Since buy-in at all levels is critical to the success of data literacy initiatives, the team decided that the cons of individual assessments outweighed the advantages and did not include them in the initial assessment.
To augment the information acquired by evaluating artifacts, faculty conducted follow-up interviews with a variety of nontechnical leaders (middle managers and executives). Talking with these leaders provided insight into the learning culture, ecosystem, and analytical architecture of the organization, in addition to the skills of individuals. Middle managers provided insights into the following organizational categories in the LEAP Framework:
Learning Culture (e.g., ‘What data and AI literacy learning opportunities are provided by BSB? Are you given time during work to learn?’)
Analytics and AI Ecosystem (e.g., ‘How much does data factor into your day-to-day decision-making? Is data-informed decision-making widespread or siloed within certain teams? Do you have a collaborative relationship with a data practitioner who can help you with analyses? Are there areas where you could use more data?’)
Analytical Architecture (e.g., ‘Can you access the data you need? Do you trust it?’)
In addition to talking with nontechnical leaders, faculty conducted more in-depth technical interviews with technical managers and individual contributors. Because the bank’s leaders were so thoughtful in their approach to introducing the initiative, middle managers and individual contributors were willing to open up about their data literacy strengths, knowledge and skill gaps, and learning interests.
The information that the faculty gathered through the group brainstorming sessions, artifact evaluation, and follow-up interviews provided them with an understanding of both organizational maturity and the data-related skills of BSB employees. Northeastern faculty at the Roux Institute then used the Analytical LEAP Framework to determine which data literacy courses would be most impactful, given the current state and needs of the financial service organization and its people.
The process resulted in a 2-year learning plan for over 1,100 bank employees. Employees were assigned to pathways based on their role in the LEAP data community (senior leaders, consumers, practitioners, curators, and data citizens). Courses recommended for each group are specific to their needs, and vary in length by role. For example, data citizens will complete three short courses over 2 years, for a total of 4 hours of class time. The most technical group of data practitioners will complete seven courses over 2 years, with 50 total hours of class time. While the level and depth of coursework varies by group, everyone at the organization will learn basic data literacy.
At the time of submission of this article, the teams are 7 months into the 2-year initiative. Northeastern faculty presented the learning plan to both middle managers and executives at the financial services institution, with strong reception. Nine courses have been completed to date, with impressive survey results:
96.7% (95% CI [96.1%, 97.3%]) of employees were satisfied with their course
94.8% (95% CI [94.0%, 95.6%]) reported that they gained knowledge and skills
85.7% (95% CI [84.5%, 86.9%]) reported that the course would improve their job performance
Faculty are in the early stages of assessing course impact. Initial results are based on pre- and post-course surveys of learners who completed a 2-hour course in prompt engineering. The course emphasized using GenAI to increase efficiency in a safe and ethical manner. Participants took the post-course follow-up survey about 2 months after the course ended. Highlights include the following (an illustrative interval computation appears after the list):
96.2% (95% CI [79.1%, 99.8%]) of participants reported using generative AI in their daily work (versus 55.5%; 95% CI [35.6%, 74.0%] before the course)
92.6% (95% CI [74.2%, 98.7%]) reported that the experience increased their interest in further learning about analytics and AI
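The intervals reported above are standard binomial proportion confidence intervals. As a minimal illustration of how such intervals can be computed, the sketch below uses the proportion_confint function from the Python statsmodels library; the counts shown are hypothetical and do not restate the evaluation’s actual sample sizes or interval method.

```python
# Illustration of computing binomial-proportion confidence intervals like those
# reported above. The counts below are hypothetical; the evaluation's actual
# sample sizes and interval method are not restated here.
from statsmodels.stats.proportion import proportion_confint

def report(label: str, successes: int, n: int, method: str = "wilson") -> None:
    """Print the point estimate and 95% CI for a proportion."""
    low, high = proportion_confint(successes, n, alpha=0.05, method=method)
    print(f"{label}: {successes / n:.1%} (95% CI [{low:.1%}, {high:.1%}])")

# Hypothetical small post-course sample: 26 of 27 respondents report daily GenAI use.
report("Uses GenAI daily (post-course)", 26, 27, method="beta")   # Clopper-Pearson
# Hypothetical large survey sample: 475 of 500 respondents satisfied with a course.
report("Satisfied with course", 475, 500, method="wilson")
```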
The Accelerating Insights Initiative is ongoing, and its success to date can be attributed to the engagement of middle managers at BSB. They were included in the process from the beginning, the course pathways were shaped with their input, and they were the first to participate in and give feedback on initial courses. Middle managers have introduced and celebrated the initiative with their teams, which generated widespread enthusiasm for the partnership that is reflected in the data above.
HarbourVest Partners is a global private markets firm with over 40 years of experience, headquartered in Boston, with a mission to make private markets accessible to all investors around the world. As the firm scaled significantly in recent years, it embarked on a new data and analytics strategy to maintain high performance in a rapidly changing data landscape. With over 40 years of data assets to maintain and leverage for insights, HarbourVest began a journey in 2021 to modernize all aspects of its data architecture, governance, and analytics. One of the firm’s core values is ‘Great People.’ Executive leaders recognized that for the initiative to succeed, they needed to invest in the data fluency of their people alongside the technical data work. As the core data and analytics team was established, the firm determined that a dedicated full-time leader focused on firm-wide data fluency was necessary.
The team began with a needs analysis of roles across the firm, identifying personas and the data skills associated with each. It quickly became evident during this process that, outside specific data roles, the necessary data skills across the business were as variable as the roles themselves. However, it was clear that all colleagues needed a shared language to talk about data and an understanding of how to use data to realize business value. This led to the development of a multilevel, blended learning program that included live class sessions with an instructor to teach necessary content, self-paced online coursework that mixed assigned courses with electives, and a project related to each person’s role and team objectives. The program was initially offered at two progressive levels, with a third level added in early 2024. Level 1–3 programs are between 8 and 12 weeks long, with an average of 2–3 hours per week spent on learning. By the end of 2024, more than 90% of the firm will have participated in at least one program (over 1,100 colleagues).

Alongside this programming, more than 60 of HarbourVest’s executive leaders participated in a series of data fluency leadership workshops on topics such as building a strong data culture, allocating resources to data work, and AI literacy, supporting them as they set the vision and path to execution for their organizations. Leadership workshops for both executives and middle managers are structured differently than the main programs: they take place over 2 half days (4–6 hours total) and are live and instructor-led. The middle manager program focuses on data governance, AI literacy, and how to set the vision for a given team’s data and AI future and plan team development toward those goals.
After successful pilots in 2022, the data fluency program expanded to the firm broadly in 2023. The team monitored program effectiveness through participant surveys and manager feedback, and they measured the business impact of projects in the upper levels of the program. They noticed some variation in outcomes across different teams, where some were able to leverage the training for impact on their teams and others were not. Participants who had a manager with a strong data context had a more positive experience and produced more impactful projects than those who did not. In this case, data context did not necessarily mean that the manager was in a data-focused role or had advanced skills themselves; it simply meant that they had proximity to the work of the data and analytics strategy and the value it was delivering, now and in the future. These managers already knew how to communicate the value of the training and the use of data.
The data fluency team recognized that they needed to ensure that all managers had that context, regardless of how closely they worked to the initiative currently. The team made a series of program revisions to increase manager involvement, which produced quick measurable impact, with 100% of managers reporting that the completed projects produced immediate or potential business impact. Measurement of the return on investment from the projects was also part of the program, with participants reporting how their projects saved the firm time or money, reduced risk, or increased revenue. Tracking these quantitative measures of impact helped middle managers communicate successes and opportunities to executive leadership.
In addition to supporting their team members as they go through training, the data fluency team launched a leadership workshop for over 120 middle managers in 2024. This workshop gives managers the tools to set a vision for future data and AI opportunities on their teams as HarbourVest evolves its business, determine the level of data fluency their team members need to meet their goals, and create plans for development and regular routines to get there.
Northeastern faculty at the Roux Institute also used the Analytical LEAP Framework to review the data skills inside the digital transformation organization at a global insurance company. The analysis showed that the organization’s inbound job-sourcing artifacts were almost entirely devoid of the advanced analytic skills the organization’s management stated it desired for the future. Similarly, while the company’s career ladder artifacts were very robust in the areas of team collaboration and business domain knowledge, they made no mention of the ability to source, analyze, use, and communicate data in the context of decision-making. Further investigation revealed that authoring responsibility for those artifacts was delegated primarily to line (middle) managers—but those existing managers did not necessarily have the data literacy background to know what behaviors and skill sets to expect of their people. It is clear that if organizations are not hiring, incentivizing, and promoting on the basis of the behaviors and skill sets they actually need for the future, those organizations will be far less equipped to maximize their value from the data economy. That said, it is not fair to assume that existing middle managers are equipped to implement those choices.
These issues are compounded by the inherently siloed nature of many organizations and legacy ‘metrics and reporting’–based approaches to the analytical function. In the recent past, many organizations offloaded data-centric activities (ETL: Extract/Transform/Load, analysis, reporting, forecasting, etc.) to a specialized team, either centralized or embedded in the lines of business. Those specialized teams acted as internal service providers, complete with (often unsatisfactory) service levels due to chronic underresourcing and overscheduling. Today’s democratization of data-centric tooling and computing resources makes it possible for a far wider array of workforce participants to self-service analytical outputs, but only if they have the requisite skills and the organizational support (technology, policies) to do so.
Recommendations that arose from the analysis included establishing a leadership role focused on analytics, creating an internal analytics community of practice, deploying specific coursework aligned with persona-based cohorts within the digital transformation organization (with associated internal credentialing), and creating analytics business partner roles to link the internal community of practice to lines of business.
The linchpin here is the middle manager. Internal focus groups held at the insurer mentioned above revealed that middle managers were aware of a large set of junior employees in ‘metrics and reporting’ roles—but they viewed them primarily as reactive sources for the ‘question of the day’ rather than strategic allies in building a more data-literate team. As a result, not only were the employees themselves underutilized and undertrained but the line managers had no incentive to upskill themselves or their colleagues to interact more effectively with these analytical functions. Operationalizing data literacy hinges very directly on the individual decisions made by middle managers.
When executives are unsupportive, the critical decision for middle managers is how to design a contextually appropriate approach to either change their minds or work around them. “How Executives Should Be Spending Their Time” (Lofgren, 2021) posits that executives should organize their time by categorizing work into quadrants: “Quadrant 1: Urgent and Important”; “Quadrant 2: Important but Not Urgent”; “Quadrant 3: Urgent but Not Important (to you)”; and “Quadrant 4: Not Urgent and Not Important.” Quadrant 1 is the desirable area of executive focus for the data literacy program. To get into Quadrant 1, middle managers must first execute a change management evaluation to shape messaging and drive urgency and focus. Change management actions can include:
Identify and address the specific reasons for the lack of executive support.
Clarify the data literacy program’s contribution to enabling corporate strategy.
Develop a view of ‘WIIFT’ (what’s in it for them?) for each executive.
Identify the influencers within each executive’s organization and across the executive team, and gain the influencers' support.
Highlight supportive executives and find out why they are supportive. Recruit the supportive executives to influence the unsupportive executive.
Present facts about how peer industry leaders are benefiting, and present an analysis of what would happen if the program were not supported.
Give unsupportive executives some special attention, such as ‘pre-wiring’: providing key insight into what will be presented in wider-audience meetings to get their feedback early.
The outcome of the change management process should provide insight into which approach will be most effective. While doing the work to identify the best approach may be a daunting task, not gaining support could mean the end of the data literacy program. Upfront work can result in a more efficient implementation and reduction in ongoing challenges from ‘office politics.’
As argued above, clear and consistent communication and nomenclature can reduce concerns and misconceptions. Organizations must define data literacy within their unique context, establish the value of the program, describe what peers in the industry are doing successfully, and identify the implications of not executing the program. The messaging must be simple. Executives are short on time—give them clear value statements that include industry research, along with what is expected from their team. The most common concerns are cost, time, and whether the outcome can be measured.
Most large-scale programs can be broken down into smaller pieces or defined as a minimum viable product (MVP) to deliver a measurable successful outcome. As part of the change management process, align with a supporter to help solve a well-defined, well-measured business problem and produce a success story. The most effective way to gain support for programs is to have someone besides the program owner share a success story, and then to couple that story with the research of analysts, industry thought leaders, and market leaders to showcase the value of the program.
In Power to the Middle (Schaninger et al., 2023, p. 6), the authors point out “how middle managers are uniquely positioned close to the ground, but with a crucial connection to company strategy, enabling them to guide their organizations through periods of complex and rapid change.” The use of data has accelerated due to the advancement of GenAI. Data and data literacy are no longer required only of those in data-focused roles; they are now needed at all levels of the organization. Executives are pushing to develop and execute AI strategies as quickly as possible, and all levels of the organization are using GenAI in their everyday jobs, but it is the middle manager who must connect the dots from the executive vision to how data is used by all employees.
The integration of GenAI into the business landscape has not only highlighted the significance of data literacy but also underscored a pivotal shift in the operational dynamics of organizations. The essence of data literacy transcends the mere acquisition of technical skills to embrace a cultural change within organizations. This cultural shift, underpinned by the critical role of middle managers, is required to navigate the complexities of data-driven decision-making and to foster an environment where data literacy flourishes at all levels of the organization.
While the rapid evolution of AI and GenAI offers an unprecedented opportunity for organizational growth and innovation, the realization of these benefits is contingent upon a workforce that is not only proficient in data literacy but also adept at applying these competencies in a practical, ethical, and impactful manner. The role of middle managers emerges as both a conduit and a catalyst in this process, bridging the gap between visionary leadership and the operational workforce. By translating the abstract concept of data culture into tangible actions and expectations, middle managers play an instrumental role in operationalizing data literacy, thereby facilitating a seamless integration of data-driven practices into the fabric of daily operations.
Moreover, the strategic emphasis on middle managers as the primary agents of change underscores the necessity of equipping them with the requisite skills, resources, and authority to drive a data-centric cultural shift. This entails not only the identification and nurturing of data literacy within their teams but also the fostering of an environment where questioning, experimentation, and innovation are encouraged and celebrated. The evolution of middle management from traditional supervisory roles to champions of data literacy is indicative of the broader organizational transformation required to thrive in an increasingly data-driven world.
By prioritizing the development and empowerment of middle managers as the linchpins in the operationalization of data literacy, organizations can navigate the challenges and opportunities presented by the digital age, thereby ensuring not only their survival but their prosperity in an ever-evolving competitive landscape.
Coauthors Dan Koloski and Berkeley Almand-Hunter are employees of Northeastern University’s Roux Institute, which is the provider of the data literacy consulting and training regimes described in Section 5.1 (Case Study 1: Banking) and Section 5.3 (Case Study 3: Insurance). Coauthor Caitlin Porter is an employee of HarbourVest, where she leads the data literacy learning and development process described in Section 5.2 (Case Study 2: Private Markets).
American Psychological Association. (2024, June). 2024 Work in America Survey. https://www.apa.org/pubs/reports/work-in-america/2024/2024-work-in-america-report.pdf
Aryng. (n.d.). Data Literacy Assessment. Retrieved March 10, 2024, from https://aryng.com/data-literacy-test
Aung, T., Niyeha, D., Shagihilu, S., Mpembeni, R., Kaganda, J., Sheffel, A., & Heidkamp, R. (2019). Optimizing data visualization for reproductive, maternal, newborn, child health, and nutrition (RMNCH&N) policymaking: Data visualization preferences and interpretation capacity among decision-makers in Tanzania. Global Health Research and Policy, 4, Article 4. https://doi.org/10.1186/s41256-019-0095-1
Blank, S. (2013, May). Why the lean startup changes everything. Harvard Business Review. https://hbr.org/2013/05/why-the-lean-start-up-changes-everything
Blank, S. G., & Dorf, B. (2020). The startup owner's manual: The step-by-step guide for building a great company. Wiley.
Bonikowska, A., Sanmartin, C., & Frenette, M. (2019, August 14). Data literacy: What it is and how to measure it in the public service. Catalogue no. 11-633-X — No. 022. Analytical Studies Branch, Statistics Canada. https://www150.statcan.gc.ca/n1/pub/11-633-x/11-633-x2019003-eng.htm
Cambridge Assessment Network and Questionmark Computing, Limited. (2021). Questionmark Data Literacy by Cambridge Assessment. Retrieved January 2, 2025, from https://ondemand.questionmark.com/delivery/open.php?session=0270539000270539&lang=-&customerid=405708&name=anon&group=public
Carlson, C., & Wilmot, W. (2006). Innovation: The five disciplines for creating what customers want. Crown Business.
Cui, Y., Chen, F., Lutsyk, A., Leighton, J. P., & Cutumisu, M. (2023). Data literacy assessments: A systematic literature review. Assessment in Education: Principles, Policy & Practice, 30(1), 76–96. https://doi.org/10.1080/0969594X.2023.2182737
Davenport, T. H., & Harris, J. G. (2017). Competing on analytics: The science of winning. Harvard Business School Press.
Data Literacy Project. (n.d.). Discover Your Data Persona. Retrieved March 10, 2024, from https://thedataliteracyproject.org/assessment/quiz/
Fergusson, L. (2022). Learning by… Knowledge and skills acquisition through work-based learning and research. Journal of Work-Applied Management, 14(2), 184–199. https://doi.org/10.1108/JWAM-12-2021-0065
Logan, V., & Duncan, A. (2018). Getting started with data literacy and information as a second language. Gartner Research. Retrieved March 31, 2024, from https://www.gartner.com/en/doc/3892877-getting-started-with-data-literacy-and-information-as-a-second-language
Elements. (n.d.). How it works? Retrieved March 31, 2024, from https://www.dataelements.io/platform
Ishwarappa, & Anuradha, J. (2015). A brief introduction on big data 5Vs characteristics and Hadoop technology. Procedia Computer Science, 48, 319–324. https://doi.org/10.1016/j.procs.2015.04.188
Kim, J., Hong, L., Evans, S., Oyler‐Rice, E., & Ali, I. (2023). Development and validation of a data literacy assessment scale. Proceedings of the Association for Information Science and Technology, 60(1), 620–624. https://doi.org/10.1002/pra2.827
Klausmeyer, S., & Heffner-Cosby, A. (2024). Future-ready excellence: Bangor Savings Bank and Northeastern University’s Roux Institute’s partnership strengthens customer connections through data and artificial intelligence. Bangor Savings Bank. https://www.bangor.com/about-us/blog/roux/
Koloski, D., Blattner, Z., & Almand-Hunter, B. (2024, May 3). Analytical LEAP: A new framework for targeting organizational investment in workforce upskilling for the age of AI. Roux Institute at Northeastern University. https://roux.northeastern.edu/leap/
Kondratjew, H., & Kahrens, M. (2019). Leveraging experiential learning training through spaced learning. Journal of Work-Applied Management, 11(1), 30–52. https://doi.org/10.1108/JWAM-05-2018-0011
Kouts-Klemm, R. (2019). Data literacy among journalists: A skills-assessment based approach. Central European Journal of Communication, 12(24), 299–315. https://doi.org/10.19195/1899-5101.12.3(24).2
Lofgren, J. (2021, February 8). How executives should be spending their time. Forbes. https://www.forbes.com/sites/forbescoachescouncil/2021/02/08/how-executives-should-be-spending-their-time/?sh=2dc75ca2506b
myDatabilities. (n.d.). Data to the people. Retrieved March 10, 2024, from https://www.mydatabilities.com/
News Literacy Project. (n.d.). News Lit Quiz: Can you make sense of data? Retrieved March 10, 2024, from https://newslit.org/tips-tools/can-you-make-sense-of-data/
Okamoto, K. (2017). Introducing open government data. The Reference Librarian, 58(2), 111–123. https://doi.org/10.1080/02763877.2016.1199005
Osterwalder, A. (2013, May 6). A better way to think about your business model. Harvard Business Review. https://hbr.org/2013/05/a-better-way-to-think-about-yo
Phillips, J., & Klein, J. D. (2023). Change management: From theory to practice. TechTrends, 67, 189–197. https://doi.org/10.1007/s11528-022-00775-0
Ridsdale, C., Rothwell, J., Smit, M., Ali-Hassan, H., Bliemel, M., Irvine, D., Kelley, D., Matwin, S., & Wuetherick, B. (2015). Strategies and best practices for data literacy education: Knowledge synthesis report. Dalhousie University. https://doi.org/10.13140/RG.2.1.1922.5044
Schaninger, B., Hancock, B., & Field, E. (2023). Power to the middle. Harvard Business Review Press.
Zunino, D., Suarez, F. F., & Grodal, S. (2019). Familiarity, creativity, and the adoption of category labels in technology industries. Organization Science, 30(1), 169–190. https://doi.org/10.1287/orsc.2018.1238
©2025 Dan Koloski, Caitlin Porter, Berkeley Almand-Hunter, Stephen Gatchell, and Valerie Logan. This article is licensed under a Creative Commons Attribution (CC BY 4.0) International license, except where otherwise indicated with respect to particular material included in the article.