Monday, August 30, 2010

Adult Education FAQS - Adult Edu Jobs

I came across this very informative article on the prospects for adult educators in today's marketplace.

Click the above title link.

Having worked in the health sector for a decade, and more specifically in the area of adult education, I've come to realize the vast range of applications possible from both the health client's perspective and the health practitioner's perspective. My particular focus has been e-learning: transferring knowledge from those who have it to those who don't through an accessible medium in professional practice.
We see access to adult education in the health sector as seamless at times, where information is passed on both informally and formally; however, there are parts of the population (health workers) who still do not have access to the most current knowledge and its application. In my experience, e-learning has come a long way over the years in helping to address this gap.
For those working in rural and remote areas, education comes to them, rather than their having to leave their communities, travel long distances, and incur great costs to receive information.
Of course, adult learners learn in various styles, and e-learning may not be ideal for some who are kinesthetic, spatial, musical, or tactile learners. The hands-on application of learning may need to be built in as part of the program's transition into the community.
I therefore suggest that a blended learning approach to program planning be adopted for these learners to lessen the perceived barriers.
E-learning itself has evolved to accommodate more learning styles over the years with the incorporation of multimedia and social media sources. No longer is it a static textbook on screen with a testing component, but a more diverse presentation of the learning materials through text, audio, video, and network interaction, either in real time or after the fact. Moreover, given program planning for diverse learning, access to technology, and bandwidth availability, adult learning over the internet can be an even richer experience than in the classroom.

Thursday, August 12, 2010

PPE Reflections

As my course in program planning and evaluation is coming to a close and I'm wrapping up the final assignment - a program and evaluation plan for 'creating workplace sustainability' - I've had the opportunity to reflect on some key areas that really engaged me and have me looking forward to implementation in my professional practice.

Graphic Models - e.g., flowcharting:  This allows for a bigger-picture overview without too much detail.  I found that this was very helpful in illustrating exactly how key elements of the program plan are connected and how it all comes together toward the program goal. Here is an example of ours:

Logic Models - yes, another model (can you tell I scored higher as a visual learner):  This model helped me to understand what ingredients need to go into baking the pie and what feeling I anticipate people having after eating the pie.  So essentially Inputs, Outputs, and Outcomes, along with Assumptions and Intervening Factors - all of these elements were essential in making sure key considerations in the program plan were covered.  Here is our example:


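A logic model like the one described above can also be captured as a plain data structure, which makes it easy to check that no element has been left empty. This is a minimal sketch; every entry below is an illustrative assumption, not content from our actual plan:

```python
# A hypothetical logic model for a workplace-sustainability program.
# All entries are illustrative placeholders, not from the real plan.
logic_model = {
    "inputs": ["facilitator time", "e-learning platform", "sponsor funding"],
    "outputs": ["six modules delivered", "certificates issued"],
    "outcomes": ["less office waste", "staff champion sustainability"],
    "assumptions": ["staff have time to attend", "leadership supports the program"],
    "intervening_factors": ["budget changes", "staff turnover"],
}

# The model only covers its key considerations if all five elements are filled in.
missing = [element for element, items in logic_model.items() if not items]
print("missing elements:", missing or "none")
```

Writing the model down this way is just a convenience for review; the real value is still in drawing it out and discussing it with the planning team.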
Learning Styles & Multiple Intelligences - when creating learning objectives/outcomes:  I realized the importance of being aware of the various adult learning styles that may be present in any given program execution.  Going through the motions of trying to build a program that included activities for a variety of styles and intelligences turned out to be a more challenging task than I anticipated, but a very worthwhile one in order to achieve the desired learning outcomes.

Experimental Design - planning the evaluation piece of the plan:  Again, building this process allowed me to have an overview of how the program could be evaluated - when to do the evaluations and how (with what tools) was key to the experimental design.  For example, in the plan I was building we ended up with the following (pre, post, post, post):
Now, having the funding and senior team or sponsor approval for this is the work that still has to be done, but in an ideal world, these are the points where data collection would be needed to give the best idea of program success and guidance for future planning.
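A pre/post/post/post design like the one above can be written out as a simple measurement schedule. As a sketch, the follow-up intervals below are my own assumptions, not timings from the actual plan:

```python
# Measurement points for a pre/post/post/post design.
# The follow-up intervals (3 and 6 months) are assumed for illustration.
schedule = [
    ("pre",  "baseline data collection before the program begins"),
    ("post", "data collection at program completion"),
    ("post", "follow-up data collection, assumed at 3 months"),
    ("post", "follow-up data collection, assumed at 6 months"),
]

for step, (phase, description) in enumerate(schedule, start=1):
    print(f"{step}. {phase}: {description}")
```

Laying the points out like this makes it easier to see what sponsor commitment is actually being asked for: not one evaluation, but four separate collection efforts over time.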

Data Collection & Analysis - in general:  this is a HUGE area of learning for me and will be an ongoing study. There is such a variety of tools and techniques to use, and deciphering what is appropriate for each program plan is the challenge - a good challenge, but one that definitely requires more study on my part.

The Program and Evaluation Plan - pulling it all together:  the way the course was structured, with each element of the plan coming together as we went along to a culmination of the pieces, worked really well for me in terms of really seeing how the plan is built and learning how to incorporate each piece along the way.  This type of instructional design is something I really appreciated, and it made the final program plan much more fine-tuned by the end for handing in.

So, this was a relevant course to my professional practice as a project manager in e-learning.  I look forward to building on the knowledge gained and putting it to work in my next project plan!

Becoming an Interview Expert

The experience of expanding my expertise in an area of program planning and evaluation seemed daunting at first.  How could I dig deep enough into my chosen topic of interview and observation methods as data collection tools in just a matter of weeks?

The truth of the matter is that I'm not even close to being an expert, but I have gained a whole lot more insight through the exercise.  I chose to focus on increasing my knowledge in this particular area of program evaluation because, in my role as a project manager, I've often spent most of my time up front planning the project: scope, charter, quality management, etc. The program evaluation, however, typically fell by the wayside, primarily because it wasn't planned for up front with its own set of deliverables/outcomes to meet.

I certainly needed to learn more about conducting a program evaluation and, more specifically, about data collection tools - how to gather the information needed to measure success in a program or to test a particular hypothesis.   I honed in on one particular method: interviewing.

Perhaps the most time-consuming, costly, and hardest to record, it is also the method that can produce the richest, most detailed, and clearest information.   There are structured and semi-structured interviews, and interviews can take place in person, face-to-face, over the phone, and even over the internet with chat and video conferencing applications!  So, as you can see, interviewing as a method of data collection is an exciting one - and this is just the tip of the iceberg!

Observation, as another method of data collection, is also quite exciting I found, but that will require another entry here.  Checklists, scripted description, anecdotal description, or unstructured observation - again, another rich medium for gathering insightful data!

Saturday, August 7, 2010

Evaluation, Stats & Pizza

The "Pizza Exercise."  This was an exercise undertaken in my PPE class to aid in understanding program evaluation stats and analysis: what the different kinds are and what they can be used for - paired-t and student-t tests, mean, mode, median, etc.
Which pizza is best?
Now if only this exercise had included real pizza - that would have made it a whole lot more fun!  Nonetheless, it was still effective in demonstrating data collection and its translation into useful statistics for reporting.  Comparing two pizzas with the same topping, "pizza A" and "pizza B," I learned that the degrees of freedom for these tests is the sample size minus one (n - 1), a standard calculation.  Another thing that I gleaned from this exercise was the importance of noting variables in experiments.  For example, if participants ate two slices of pizza A and then two slices of pizza B, eating the first two slices may influence how much of pizza B they eat, because they could be full after the first set.  Other influencing factors could be the time of day of consumption, people who don't eat pork, how hungry people are, etc.
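To make the (n - 1) degrees-of-freedom point concrete, here is a minimal sketch of a paired-t calculation in Python. The tasting scores are made-up numbers for illustration, not data from the class exercise:

```python
import math

# Made-up taste scores (1-10) from five tasters; each taster rated both pizzas,
# which is what makes this a *paired* design.
pizza_a = [7, 8, 6, 9, 7]
pizza_b = [6, 7, 5, 9, 5]

# The paired-t test works on the per-taster differences.
diffs = [a - b for a, b in zip(pizza_a, pizza_b)]
n = len(diffs)
df = n - 1  # degrees of freedom: sample size minus one

mean_diff = sum(diffs) / n
# Sample variance of the differences (divide by df, not n).
var = sum((d - mean_diff) ** 2 for d in diffs) / df
t_stat = mean_diff / math.sqrt(var / n)

print(f"df = {df}, t = {t_stat:.3f}")
```

The paired design matches the exercise because each taster tried both pizzas; comparing scores from two independent groups of tasters would call for a student-t (two-sample) test instead.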

I can't say that I feel exactly confident in using stats and data analysis in program evaluation, but I don't feel so left in the dark anymore.  This is certainly something I will need to dig a little deeper into for my own projects at work and for programs that I will be designing.

Thursday, August 5, 2010

Triad on Data Collection

As part of our PPE class, we embarked on a project to plan a program for the real world.  My partner and I are currently planning a program called "Creating Workplace Sustainability" for KPMG, where she works.  The initial idea for this program came from the "Going Green Campaign" at Saint Elizabeth Health Care, where I was the project manager.

Similar to getting a project started, a program requires elements such as approval of the budget, concept, and resources from a sponsor, and commitment for the life of the program.  Essentially, this program has six modules/topics in which office staff can choose to register.  If staff take all six, they get a sustainability certificate and could be eligible to become a program leader.

The area that poses the greatest learning opportunity for me is program evaluation. Our experimental design consists of the program data we plan to collect. We got to test the idea of this design out in class by doing "triads": we broke into groups of three, where each person got 15 minutes to introduce his or her design tool(s) for feedback.

This exercise allowed objective feedback to be given on each experimental design and tool.  This is definitely an exercise that I will note and perhaps try again as a program activity.