Effective digital records management, data management and document processing

A rapidly increasing volume of information exists in digital form. Digital technology offers considerable opportunities for rapid and efficient access to information, but there is a very real threat that digital materials will be created in such a way that not even their short-term viability can be assured, much less the prospect that future generations will also have access to them.

  1. Managing Electronic Records
    The purpose of this course is to provide participants with guidance on the responsible management of electronic records that aligns with and supports the Institutional Records Policy. These guidelines, in accordance with other established policies and procedures, apply to all electronic records, regardless of their digital form, created or received by an office or department in the transaction of its proper business or in pursuance of its legal obligations. A number of new technologies can be used to create electronic records, and record creation can be either active (e.g., adding data to a database) or passive (e.g., automated logging of system updates). Individual records may be created in electronic mail systems, as web-based publications, and as documents created and stored in administrative information systems.
  2. Document processing – By some estimates, about 80% of information is unstructured, yet this data is crucial for many purposes. Converting these documents into a digital format, and properly categorizing the information for use, provides many benefits that can increase an organization’s competitiveness, improve services offered to the public, and provide more significant insights into various programs and activities.

This document processing course will enhance:

  • Cost optimization. Retrieving information from unstructured documents is costly. Intelligent
    document processing with categorization makes more information available to organizations so that they can operate with greater accuracy.
  • Error prevention. Document processing automation ensures consistency and accuracy, so the information you extract is more useful.
  • Increased productivity. Automated document processing is faster than doing the analysis manually, and it is well-recognized that speed is crucial in any data migration effort.
  • Optimized data integration. Document processing can work over various platforms.
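As a concrete illustration of the categorization step described above, the sketch below shows a minimal rule-based classifier in Python. The categories, keywords, and sample text are invented for illustration; real intelligent document processing tools use far more sophisticated methods.

```python
# A minimal sketch of rule-based document categorization: turning unstructured
# text into labeled, searchable records. The taxonomy below is hypothetical.
CATEGORIES = {
    "invoice": {"invoice", "amount due", "payment"},
    "contract": {"agreement", "party", "terms"},
    "report": {"summary", "findings", "analysis"},
}

def categorize(text: str) -> str:
    """Assign the category whose keywords appear most often in the text."""
    lowered = text.lower()
    scores = {cat: sum(lowered.count(kw) for kw in kws)
              for cat, kws in CATEGORIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "uncategorized"

print(categorize("Invoice #42: amount due by 30 June"))  # prints "invoice"
```

In production, the same idea scales up with trained text classifiers, but the workflow is identical: extract text, score it against categories, and route the record accordingly.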

By creating automated document workflows, this course will enable participants to:

  • Minimize risks that arise from human error
  • Make documents easier to access
  • Improve visibility and accountability
  • Remove productivity bottlenecks

Excel for Data Exploration

This course is an accelerated introduction to a variety of research and analysis-related skills in the Microsoft Excel environment. It is designed to familiarize participants with core skills in data access, manipulation, analysis, and presentation using Excel (and Excel-like alternatives). Spreadsheet programs remain an important part of professional life, and this course will prepare participants for basic and intermediate use of the program in that setting.
This course covers data import, formatting, and management.

Analysis topics include producing tables (including summarization), visualizations, and varieties of analysis. No prior knowledge of Excel is expected, but familiarity with basic concepts of computer use and statistics is desirable.
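The kind of summarization table the course covers can be sketched in plain Python (one of the Excel-like alternatives mentioned above); in Excel itself, the same result comes from a PivotTable or a SUMIF formula. The sample data here is invented.

```python
# A sketch of group-and-summarize, the core of a PivotTable: total sales per
# region from row-level records. Data is invented for illustration.
from collections import defaultdict

rows = [
    {"region": "North", "sales": 120},
    {"region": "South", "sales": 80},
    {"region": "North", "sales": 60},
]

totals = defaultdict(int)
for row in rows:
    totals[row["region"]] += row["sales"]

print(dict(totals))  # {'North': 180, 'South': 80}
```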

Using Social Network Analysis to Identify Decision-Makers and Improve Policy and Security

While significant investment has been made to accomplish important policy outcomes, much more could have been achieved were it not for serious inefficiencies and leakages, as well as a pro-rich bias, in the design and implementation of programs. These failings, among others, are not just a problem of the formal structures of power or the technical design of policies. They have a great deal to do with who the key actors are, how they relate to each other, and how funds, information, and ideas flow. In practice, the networks of relationships that determine how policy decisions are made and implemented are often very different from those spelled out in constitutions and laws. And those networks can be decisive. With this dynamic in mind, this course will train participants in the use of Social Network Analysis (SNA) to identify who shapes social policy, and when and how it comes into being, with the aim of improving policy targeting and effectiveness. The course will enable participants to analyse the characteristics, motivations, and interactions of key players, which helps to identify sources of inefficiency.

By highlighting the social and institutional relationships that lead to decision-making in public systems, and in small versus large administrative units, the course sheds light on areas where corrections might be made in response to various challenges. Case studies will be used to reveal how a focus on the real actors behind social policy can improve coordination between different governmental and non-governmental bodies, boost the delivery of vital services, and mitigate inequalities in the context of crime, health, and economic crises, when budgets are stagnant or shrinking.

Many actors are involved in policymaking including public opinion leaders, officials at different government levels, unions, business associations, non-governmental
organizations (NGOs) and social movements. The complex interaction between these different players and the extent to which they are well coordinated helps determine whether decision-making is strongly institutionalized, stable, and has long-term horizons;
whether, in essence, it is effective. Where too many actors intervene and power is not hierarchical and centralized, social policy actors may fail to communicate and collaborate with each other, leading to poorly coordinated and inefficient policies. All this needs to be
considered in the search for more stable, coherent, and coordinated decision-making.

The Social Network Analysis (SNA) course will enable participants to discover many aspects of the policymaking process that usually remain hidden from more traditional analytical tools. We will introduce SNA as an in-depth tool to shed light on where improvements can be made and how organizations at all levels might adjust to better serve their citizens.
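To give a flavour of the method, the sketch below computes degree centrality, one of the simplest SNA measures, for an invented network of policy actors. The actors and ties are hypothetical; in practice, dedicated tools such as Gephi or the Python networkx library are commonly used.

```python
# A minimal sketch of the core SNA idea: given who interacts with whom, rank
# actors by how many direct ties they have (degree centrality). The actors
# and ties below are invented for illustration.
from collections import Counter

ties = [
    ("ministry", "ngo_a"),
    ("ministry", "union"),
    ("ngo_a", "union"),
    ("ngo_a", "mayor"),
    ("mayor", "business_assoc"),
]

degree = Counter()
for a, b in ties:
    degree[a] += 1
    degree[b] += 1

# Actors with the most direct ties are candidate brokers or decision-makers.
for actor, d in degree.most_common(3):
    print(actor, d)
```

Richer measures (betweenness, closeness, brokerage) follow the same pattern: build the network of relationships, then compute who sits where in it.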

Project Risk Analysis

Project risk analysis, or risk management, is the process of identifying, analyzing, and responding to any risk that arises during the life cycle of a project. Categories of potential risk include: technology, costs, timing, clients, contracts, financial situation, political situation, environmental situation, and people. Analyzing the risks that may arise during the execution of a project, predicting possible obstacles, and envisioning solutions in advance is vital for any project. Risk management cannot and must not be merely a reaction to events. It should be part of the project planning process from the evaluation phase onwards. Risk analysis is a crucial input for objective risk management.

Risk analysis is the systematic use of available information to determine how often specified events may occur and the magnitude of their consequences. However, the process of risk analysis can also uncover potential positive outcomes. By exploring the full space of
possible outcomes for a given situation, a good risk analysis can both identify pitfalls and uncover new opportunities.

Risk analysis can be performed qualitatively or quantitatively. Qualitative risk analysis generally involves assessing a situation by instinct or “gut feel,” and is characterized by statements like, “That seems too risky” or “We’ll probably get a good return on this.” Quantitative risk analysis attempts to assign numeric values to risks, either by using empirical data or by quantifying qualitative assessments. This course will touch on qualitative analysis but focus on quantitative risk analysis. A quantitative risk analysis can be performed in a couple of different ways. One way uses single-point estimates and is deterministic in nature. Using this method, an analyst assigns values for discrete scenarios to see what the outcome of each might be. For example, in a financial model an analyst commonly examines three outcomes: worst case, best case, and most likely case. This, however, considers only a few discrete outcomes, ignoring hundreds or thousands of others, and it gives equal weight to each outcome: no attempt is made to assess the likelihood of each.
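The single-point approach can be sketched as follows; the cost components and figures are invented for illustration.

```python
# A sketch of deterministic (single-point) analysis: three hand-picked
# scenarios for a hypothetical project cost model. All figures are invented.
def project_cost(labour, materials, delay_penalty):
    return labour + materials + delay_penalty

scenarios = {
    "best case":   project_cost(100_000, 40_000, 0),
    "most likely": project_cost(120_000, 50_000, 10_000),
    "worst case":  project_cost(150_000, 70_000, 40_000),
}
for name, cost in scenarios.items():
    print(f"{name}: {cost}")

# Note: nothing here says how *likely* each scenario is -- the key limitation
# of the deterministic approach.
```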

Interdependence between inputs, the impact of different inputs relative to the outcome, and other nuances are ignored, oversimplifying the model and reducing its accuracy.

Yet despite its drawbacks and inaccuracies, many organizations operate using this type of analysis. A better way to perform quantitative risk analysis is by using Monte Carlo simulation. In Monte Carlo simulation, uncertain inputs in a model are represented using
ranges of possible values known as probability distributions. By using probability distributions, variables can have different probabilities of different outcomes occurring.
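A minimal sketch of this idea, using Python's standard library and triangular distributions (low, high, most likely) with invented parameters:

```python
# A sketch of Monte Carlo simulation: each uncertain input is drawn from a
# probability distribution instead of being fixed at a single value. The
# cost model and distribution parameters are invented for illustration.
import random

random.seed(1)  # fixed seed so the example is reproducible
N = 10_000
costs = []
for _ in range(N):
    labour = random.triangular(100_000, 150_000, 120_000)   # low, high, mode
    materials = random.triangular(40_000, 70_000, 50_000)
    penalty = random.triangular(0, 40_000, 10_000)
    costs.append(labour + materials + penalty)

costs.sort()
mean = sum(costs) / N
p90 = costs[int(0.9 * N)]  # 90th percentile of total cost
print(f"mean cost ~ {mean:,.0f}, 90th percentile ~ {p90:,.0f}")
```

Instead of three hand-picked numbers, the analyst now has a full distribution of outcomes and can read off, say, the cost level that will only be exceeded 10% of the time.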

Monte Carlo simulation provides a number of advantages over deterministic, or “single-point estimate,” analysis:

  • Probabilistic Results.
    Results show not only what could happen, but how likely each outcome is.
  • Graphical Results.
    Because of the data a Monte Carlo simulation generates, it’s easy to create graphs of different outcomes and their chances of occurrence. This is important for communicating findings to other stakeholders.
  • Sensitivity Analysis.
    In deterministic analysis, with just a few cases, it is difficult to see which variables impact the risk the most. In Monte Carlo simulation, it’s easy to see which factors have the biggest effect on the risk. This greatly assists policy and decision-making during risk management.
  • Scenario Analysis.
    In deterministic models, it’s very difficult to model different combinations of values for different inputs to see the effects of truly different scenarios. Using Monte Carlo simulation, analysts can see exactly which inputs had which values together when certain outcomes occurred. This is invaluable for weighing different policy
    and mitigation alternatives and prioritizing the most optimal single or combination of solutions within available resources.
  • Correlation of Inputs.
    In Monte Carlo simulation, it’s possible to model interdependent relationships between the input factors affecting risk. For accuracy, it’s important to represent how, in reality, when some factors go up, others go up or down accordingly; this helps identify which factors are objective targets for risk management efforts.
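The effect of correlated inputs can be sketched as follows, using the standard trick of mixing two independent normal draws; the correlation value and cost parameters are invented for illustration.

```python
# A sketch of correlated inputs in Monte Carlo sampling: when labour costs run
# high, material costs tend to run high too. All parameters are invented.
import math
import random

random.seed(2)
rho = 0.8  # target correlation between the two inputs
pairs = []
for _ in range(20_000):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho**2) * random.gauss(0, 1)
    labour = 120_000 + 15_000 * z1
    materials = 50_000 + 8_000 * z2
    pairs.append((labour, materials))

# The empirical correlation of the samples should come out near rho.
n = len(pairs)
mx = sum(l for l, _ in pairs) / n
my = sum(m for _, m in pairs) / n
cov = sum((l - mx) * (m - my) for l, m in pairs) / n
sx = math.sqrt(sum((l - mx) ** 2 for l, _ in pairs) / n)
sy = math.sqrt(sum((m - my) ** 2 for _, m in pairs) / n)
corr = cov / (sx * sy)
print(round(corr, 2))  # close to 0.8
```

Ignoring such dependence understates the spread of total cost, because extreme inputs tend to move together rather than cancel out.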

This course will use Excel as a platform for performing quantitative risk analysis in a spreadsheet model. Many risk analysts still unnecessarily use deterministic risk analysis in spreadsheet models when they could easily add Monte Carlo simulation. New functions for defining distributions and analyzing output results will be added to Excel. Basic knowledge of Excel is desirable, although an introduction will be given.