Weirding Word®
There is this ridiculous intersection in Williamsburg, VA. As soon as westbound traffic passes through, the cars must choreograph a merge in a very short space of time -- literally no more than about 20 feet and 3 seconds. Now, merging is a pain under normal circumstances, but this merge is a left-to-right maneuver, not the standard right-to-left. Yes, folks, the left lane (the side next to the oncoming traffic) disappears about 3 seconds after you cross the intersection. Not only do you need to find a way to play nice with the folks in the other lane, you've also got to make sure you don't stray over the double yellows into the grille of some guy's F-150. And, truly, this is how it is merging in professional life. Are the right people at the table? Putting a damper on your euphoria? Gaea Honeycutt blogs at WeirdingWord.com. Weirding Word®, a division of G.L.
Program evaluation
The process of evaluation is considered a relatively recent phenomenon; however, planned social evaluation has been documented as far back as 2200 BC.[4] Evaluation became particularly relevant in the U.S. in the 1960s, during the period of the Great Society social programs associated with the Kennedy and Johnson administrations.[5][6] Extraordinary sums were invested in these programs, but the impacts of the investments were largely unknown. Program evaluations can involve both quantitative and qualitative methods of social research. People who do program evaluation come from many different backgrounds, such as sociology, psychology, economics, and social work. Some graduate schools also have specific training programs in program evaluation.
Doing an evaluation
Program evaluation may be conducted at several stages during a program's lifetime.
Assessing needs
There are four steps in conducting a needs assessment:[9]
Assessing program theory
Validity
Training Evaluation Blog
22 January 2013 | Written by Dave Basarab
Frequently I share best practices in the training and development field. Those of you who have read my posts previously know that I believe in a Learning to Performance approach. Here is a recent success using this approach with one of my clients.
My client and their business issue.
The solution.
Step 1 – The Impact Map.
Step 2 – Redesigned Work Processes & Tools. We also agreed to use an online project management collaboration tool called Smartsheet as the software for all project teams.
Step 3 – Design & Development.
Step 4 – Management Prep Team Session. The project management workshop:
Project management principles & processes
How to make project management work and be an executive sponsor
Meeting with your employee before and after the workshop
Providing post-workshop support
After this session, I met with the Steering Committee to talk about next steps.
Step 6 – Organizational Support by the Management Team.
(1) Source: The Standish Group
Basic Guide to Outcomes-Based Evaluation for Nonprofit Organizations with Very Limited Resources
© Copyright Carter McNamara, MBA, PhD, Authenticity Consulting, LLC. Adapted from the Field Guide to Nonprofit Program Design, Marketing and Evaluation.
Description
This document provides guidance toward basic planning and implementation of an outcomes-based evaluation process (also called outcomes evaluation) in nonprofit organizations, particularly small nonprofits with very limited resources.
NOTE: This free, basic, online guide makes occasional references to certain pages in the United Way of America's book, Measuring Program Outcomes: A Practical Approach (1996).
NOTE: Outcomes-based evaluation is but one type of evaluation -- there are many types of evaluations.
Table of Contents
Reasons for Priority on Outcomes-Based Evaluation
Basic Principles for Small Nonprofits to Remember Before Starting Outcomes Planning
What is Outcomes-Based Evaluation?
The Challenge of Organizational Learning (July 13, 2011)
Disseminating insights and know-how across any organization is critical to improving performance, but nonprofits struggle to implement organizational learning and make it a priority. A recent study found three common barriers to knowledge sharing across nonprofits and their networks, as well as ways and means to overcome them.
Reinventing the wheel -- this well-worn phrase describes one of the oldest of human follies: undertaking a project or activity without tapping into the knowledge that already exists within a culture or community. Consider the views of Kim Oakes, director of sharing and communities of practice at the Knowledge Is Power Program (KIPP), a national network of 99 charter schools serving 27,000 students via 1,900 teachers. Or consider World Vision, an international Christian development organization with an annual budget of more than $2 billion operating in 93 countries. Authors ranging from the late business historian Alfred D.
Evaluation home
All faculty and staff of the University of Wisconsin-Extension Cooperative Extension are responsible for quality educational programs that produce positive and equitable results. To support a learning organization and to ensure the wise investment of resources, evaluation is part of each person's job responsibilities. Written evaluation plans are developed as part of the program planning process. In recent years, the organization has moved to an outcome-based planning and reporting system, with evaluation playing a critical role in learning, program improvement, and accountability. Program Development and Evaluation (PDE) is charged with providing leadership and capacity building in program evaluation through in-service education, resource material development, consultation, administrative support, and the management of high-priority evaluations.
Measurement Tools for Evaluating Out-of-School Time Programs: An Evaluation Resource
Introduction
A growing investment in evaluation, for purposes ranging from continuous improvement to accountability, has led to increased requests from the out-of-school time (OST) community for practical evaluation tools. As part of Harvard Family Research Project's continuing effort to help practitioners and evaluators choose appropriate evaluation methods, this guide describes a select set of instruments and tools that can be obtained and used for on-the-ground program evaluation. Whether you are conducting first-time internal evaluations or large-scale national studies, these evaluation instruments can be used to assess the characteristics and outcomes of your programs, staff, and participants, and to collect other key information. The information in this guide can help practitioners and evaluators find evaluation instruments that match their program and evaluation goals and characteristics. The evaluation instruments in this guide are presented in tables organized by content area.