This is surprising. We keep hearing about how we are in the "knowledge age"...and that knowledge is our most important advantage as institutions, companies, and countries. Governments provide enormous investments in knowledge-related work: research, universities, etc. Essentially, knowledge is to the future of work as oil was (is) to the industrial world - i.e. the basis on which the economy is starting to run. And yet, organizations aren't effectively considering how data can be analyzed to provide new insights, learning, and knowledge growth. Or, if they are considering it, they don't appear to be doing so from a strategic, systems-wide level.
1. What should an organizational data strategy look like?
2. What are the skills/people that should comprise an analytics team?
I'm not too clear yet on question 1, but for the analytics team, I'd suggest:
1. Leadership should be involved - not to confine the work of the team, but to communicate its value to others in the organization. Insights generated by an analytics team will need to be championed within the university/corporation. Important developments will have greatest impact if they are scaled up systemically.
2. Data scientists. This is a vague term, but I use it here to mean those who define the structural/organizational data needs and ensure that the right types of data are being collected to help analyze and improve learning.
3. Data miners - these are the folks who run statistical analyses of data, seek patterns, etc. Obviously, they work very closely with the data scientists.
4. Sociologist/psychologist: people who understand the social (softer) aspects of interaction and knowledge growth/exchange. An algorithm is one way of analyzing a domain. Other approaches exist and also need to be explored.
5. Faculty/trainers: individuals who are able to augment discoveries or innovations by the data scientists/miners by offering a perspective based in the reality of teaching/learning or workplace settings.
Who else should be involved?
The CIO of, for instance, a university should go well beyond keeping the systems running. He or she should present the old and new options to the scientific and pedagogical councils, and should put forward academic analytics INCLUDING learning analytics.
I'm even thinking of applying for a job as the CIO of Universidade do Algarve in the hope of being able to do in information systems what, as a PhD lecturer in software engineering, I'm not even allowed to influence!
I leave open the question of whether there is a need for a new profile for university CIOs. Would having a CTO in a university make sense, or might it clash with some of the existing research groups? Who should contribute to the strategy and who should lead? How do we motivate (or push) CIOs to extend their reach into new areas like learning analytics?
There are multiple people and multiple levels of data mining, reporting, and analysis at my workplace. All units attend to issues related to student learning - institutional research and accreditation, the president and provost, the various colleges, units, and programs, IT, Library Media Specialists, and the various unions - and the students have access to reports and opportunities for input.
Experienced people and specialists in various fields are hired to assist with the design of the organizational data strategy. The various units participate in posing questions and planning, making decisions about what data needs to be gathered, and then mining, analyzing, interpreting, representing, and reporting the data. In addition, generalists are involved in interpreting data. Faculty have access to data and reports and assist in data interpretation and use through committees. Students compile electronic portfolios, have access to data that is pertinent to their learning, and have multiple ways to provide feedback about how to improve the learning climate and culture. Citizens also have access to reports, and they weigh in on the interpretation of data. We don't always see eye to eye, so we negotiate.
Organizational data mining and management is a human process. At my workplace, the team is dedicated to finding new ways to use the data to improve teaching, learning and situations of practice.
George writes: "data scientists..., data miners, .... and faculty trainers need to be able to work closely with people." This is true. There is great diversity on our team, and members of the team are knowledgeable people. People have different roles in the system. We have seen turnover in our leadership, so we work together to get things done. We have an excellent IT group. Our librarians are doing wonderful work. On the team we have plenty of people who are proficient at using a variety of tools and disciplinary lenses for solving problems. We apprehend patterns, comprehend complexity, and reframe situations when necessary. As importantly, we are resilient, which means we are not easily discouraged when things do not turn out as we thought they would. We are curious and able to learn, relearn, and unlearn as necessary.
The biggest challenge our team currently faces is finding tools that are not clunky. We would like to find tools that talk to each other. You probably call them integrated systems. But there has to be a way to set up an integrated system so that innovation is always possible. We want to be able to use new tools and new technologies in creative ways. It is not uncommon to hear busy people say: "I finished the task, but this 'tool' is 'a piece of work'. It is unacceptable that we cannot link to the data in x report system and download reports within this system."
Thanks for recommending that we take a look at:
Analytics: The New Path to Value, Nov 30, 2010. MIT Sloan Management Review.
The article answers the question:
How are the smartest organizations embedding analytics to transform insights into action?
I found several of the chapters particularly relevant to this thread. Chapter 4 provides some perspective on the system.....
I was not surprised to see that the most effective organizations begin with questions and make decisions about which data to gather before gathering it.
It seems logical that the use of data to support decision making increases as people get used to the idea and as data is archived and analyzed.
"The frequency with which analytics is used to support decisions increases as organizations transition from one level of analytic capability to the next. At the same time, analytics migrate toward centralized units, first at the local line of business and then at the enterprise level, while the portion of analytics performed at points of need and with IT remains stable."
Chapter 4 proposes multiple levels at which analytic adoption can exist within an organization. The list expands on the one that was generated in the previous post.
"Figure 11. Analytic adoption spreads throughout organizations in a predictable pattern, as all respondents gained proficiency with functional analytics in the same order. The rate of adoption, as shown through proficiency, increases steadily and threshold levels support the analytic capability tiers."
"Brand and market
Workforce planning and allocation
Product research and development
Strategy and business development
Sales and marketing
Operations and production"
I was surprised when I read in chapter 5:
"Organizations want data that is integrated, consistent, and trustworthy, which were the leading data priorities cited by our respondents."
I found the Outline of the Information Agenda interesting:
"Outline for an information agenda: The information agenda provides a vision and high-level road map for information that aligns business needs to growth in analytics sophistication with the underlying technology and processes spanning:
- Information governance policies and tool kits — from little oversight to fully implemented policies and practices
- Data architecture — from ad hoc to optimal physical and logical views of structured and unstructured information and databases
- Data currency — from only historical data to a real-time view of all information
- Data management, integration and middleware — from subject area data and content in silos to enterprise information that is fully embedded into business processes with master content and master data management
- Analytical tool kits based upon user needs — from basic search, query and reporting to advanced analytics and visualization."
So what do you think?
1. The profile of the CIO/CTO/CKO should of course include leadership skills, skills in people management, expectation management, risk management, and change management, plus the ability to influence top management in deciding on the purposes and implementation of analytics for teaching and learning in the university.
2. Back to basics - the use of analytics should align with the business strategy and mission of the organisation in terms of teaching and learning. There's no point having vast data without knowing what you want from it. Only by knowing what we want from the data (or task) can we identify the methods for analysing the patterns in that vast data further.
3. 'Project stakeholders' should include everyone (a member or two from each faculty, support department, etc.) so that every aspect of data usage is covered and looked into, especially data that integrates/interlinks across departments and faculties. For example: students who fail to catch up may be having issues with finances, registration records, or language fluency, which other departments may hold records of. Stakeholders (on the steering committee of this organisational Learning Analytics 'project') should be able to state what is needed from the analytics from the beginning of the project.
4. At the end of the day, the knowledge we gain from the vast data and learning analytics should reflect the learning objectives of the programmes we offer. The success of each cohort can be monitored, and if the problem is deemed to lie in the syllabus itself, then the programme can be reviewed and changed, and so on...
5. Like it or not, we can't run away from evaluating and assessing the analytics tool itself. A knowledge audit of what has been learnt should also be looked into.
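The cross-department linking in point 3 could be sketched very roughly as below. This is a hypothetical illustration only - the student IDs, field names, pass mark, and the idea that each unit's records can be read as simple lookups are all assumptions, not a description of any real system:

```python
# Hypothetical records held by different units; in reality these would be
# separate systems that the steering committee agrees to link.
grades = {"s001": 42, "s002": 78, "s003": 55}                 # registry: average mark
fees_overdue = {"s001": True, "s002": False, "s003": False}   # finance office
language_support = {"s003"}                                   # language centre referrals

def at_risk(student_id, pass_mark=50):
    """Flag a failing student and list possible contributing factors
    known to other departments; return None if the student is passing."""
    if grades.get(student_id, 0) >= pass_mark:
        return None
    factors = []
    if fees_overdue.get(student_id):
        factors.append("overdue fees")
    if student_id in language_support:
        factors.append("language support referral")
    return factors or ["no known factor on record"]

print(at_risk("s001"))  # ['overdue fees']
```

The point of the sketch is only that the flag and its explanation come from different units' data, which is exactly why every department needs a seat at the stakeholder table.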
These are randomly poured from a tired mind. Time check: it's midnight here in Kuala Lumpur.
- Shazz, Kuala Lumpur
Is there anything like an auto-suggest of books to read if the analytics find that a student has fallen behind on certain topics?
Or maybe some 'alert' function that can flag certain 'inactivity' by some learners after more than a week or so?
I know data mining and prediction analytics can somehow do that.
- Shazz, Kuala Lumpur
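The inactivity alert Shazz describes is straightforward once last-activity timestamps are available. A minimal sketch, assuming a hypothetical lookup of each learner's most recent recorded action (in practice this would come from the LMS event logs; the names and dates here are invented):

```python
from datetime import datetime, timedelta

# Hypothetical activity log: learner -> timestamp of last recorded action.
last_activity = {
    "alice": datetime(2011, 2, 20),
    "bob": datetime(2011, 2, 1),
    "carol": datetime(2011, 2, 18),
}

def inactive_learners(last_activity, now, threshold_days=7):
    """Return learners with no recorded activity within the threshold window."""
    cutoff = now - timedelta(days=threshold_days)
    return sorted(name for name, seen in last_activity.items() if seen < cutoff)

flagged = inactive_learners(last_activity, now=datetime(2011, 2, 21))
print(flagged)  # ['bob'] - inactive for more than a week
```

A real predictive version would go further, e.g. modelling each learner's usual activity rhythm rather than using a fixed one-week threshold, but the alerting logic starts from something this simple.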
Here are the components that I think should be part of a Data Strategy:
What problem are you trying to solve?
Who owns the problem (which stakeholders have the most vested interest in the solution)?
How much money, time and resources are the owners of the problem willing to invest in the solution?
What would a successful implementation of Learning Analytics look like in relation to the problem?
Where is the data generated? Is it inherent in the natural activity of people, or does it need to be generated by new activities?
What needs to be set up ahead of time to make the needed data available at the back end? How do those steps need to be integrated with the development plan of the program?
What tools/methodologies will be used to collect/store/retrieve and analyze the data?
Who is the target audience for the analysis? What format do they need to be able to access the analysis?
As for who should be involved, I would think anyone who is capable of doing the above - most importantly, someone with an in-depth understanding of the desired outcomes and the ability to communicate that understanding to the other stakeholders.