EduBrite 2016: A year in review

It was a great year for EduBrite. Many incredible things happened in 2016. Thanks for being part of this journey with us.

In 2016 we introduced Open Learning, a learner-centric micro-learning solution. EduBrite is one of the first in the LMS industry to offer such a solution. Open Learning helps create an engaging learning environment for your employees as well as external audiences – customers, partners, etc. You can set up a public-facing Open Learning site and leverage it as a great tool for content marketing.

Users can create their own playlists, follow, rate, and comment. They can also share content with their peer groups on social media and claim mastery points in the subject areas where they are experts.

Open Learning is also a gateway to progressive learning. It directs users to register in courses to learn their topics of interest formally, so essentially you can offer an integrated experience to your learners.

In addition to Open Learning, we have added many new features, enhanced existing ones, improved usability, and added automation, all with the key goal of continuing to improve the experience of our customers and their users. Here is a quick snapshot:

Gamification – You can create badges easily by uploading external images or using images from the library. An enhanced leaderboard is available.
Instructor-led training – New integration with Zoom.us. New functions, e.g., downloading the sign-in sheet, manually generating invoices, editing amounts, adding discounts, taking notes, and easy options for marking attendance and awarding completion.
Collaboration – Forums now offer features such as forum priority, assignment, and sub-forums to provide lightweight task/issue-tracking capabilities.
Custom properties – User-defined custom properties can now be added in various places, e.g., Group, Course, Course Session, Program, Event, Course Member, Program Member, and Event Participants. This allows you to extend the LMS objects in very flexible ways to capture your business-specific information.
Authoring – Many enhancements were added to test and course authoring throughout the year.
New role – A new role, group coordinator, was added to offer additional flexibility in decentralizing the access and management of training operations.
Reporting – Enhanced many reports, e.g., the learners report, program report, and question statistics report. Added a new transcript summary report.
Confluence integration – In addition to many new macros (e.g., for customizing the dashboard, reports, and leaderboard), you can now auto-sync users and groups.


We make it easy for you

We make your experience easy by offering integrations with Atlassian Confluence, Salesforce, Yammer, Google, and more. This allows users to access EduBrite LMS right from the apps where they spend most of their time. Additionally, we have integrated with many wonderful applications so that you can use best-of-breed tools and provide a great experience to your learners.


We have a pretty exciting 2017 ahead of us! One of our primary focuses will be to offer an even more engaging learning experience, so we have planned for several enhancements in Open Learning and Gamification. We understand that you want to offer ubiquitous learning, so expect more developments in our native iOS App and our recently released Android App.

 

Thanks for an amazing 2016. Happy Learning in 2017!


Top Down & Bottom Up LMS implementation approaches for employee training

When it comes to LMS implementation for internal training in any organization, the most common approach is top-down. The top-down approach is heavily based on maintaining control at the top level and selectively granting rights to lower levels in the training delivery hierarchy (which generally mirrors the org structure).

[Diagram: top-down implementation]

This approach requires a lot of upfront planning, identifying stakeholders across the organization, and resolving internal conflicts among different teams to create common ground for a centralized LMS. On many occasions this whole process takes several months to a year before the LMS can be fully rolled out to employees, resulting in a huge upfront cost. Even after rollout, the top-down control of rights creates an ongoing battle for control, and sometimes conflicts of interest, among the different groups that need to use the LMS. All these factors limit adoption and the return on investment.

If we look deeper into why this happens so frequently, we find the biggest cause is a common issue with many LMS products: most of them are built with the assumption of global, role-based controls. In most systems, only an admin or instructor can perform activities like creating a course and setting up its delivery. These rights are not available at the individual team or group level, resulting in big process bottlenecks. Course content has to be managed by a few selected people (L&D managers or curriculum developers) on behalf of a very large number of teams. Team leads do not get the ability to quickly create their own training courses and deliver them to their teams. This also makes the job very tough for the L&D managers, who are typically at the top of the training delivery hierarchy: they must find time to create and facilitate training courses and provision them for the entire organization. This process takes too much time to plan and becomes difficult to execute. As a result, not all departments get equal attention; they become indifferent to the LMS or find their own (ad hoc) solutions. A clear symptom of this effect is the existence of multiple LMSs in the same organization, each owned by a different team. Although this gives each team full control of its own LMS, the organization as a whole doesn't get a clean and coordinated learning environment, and the cost becomes too high.

The top-down approach has its merits for many kinds of training, such as compliance, that require full control, but there are several other kinds of training (especially informal) where it becomes a limiting factor.

EduBrite offers a clean solution to this problem by enabling a bottom-up implementation strategy. We discussed this topic in our webinar last week; the recording can be accessed here – http://bit.ly/1mSodHO

[Diagram: bottom-up implementation]

The bottom-up approach works by allowing everyone to create training content (courses) and manage its delivery. The LMS implementation can proceed at rapid speed with just an announcement of its availability and perhaps a “getting started” video, allowing different team leads to start using it for their teams. No setup of org hierarchy, departments, roles, etc. is needed. The teams that need the LMS most can be the first to adopt it and lead the way for other teams to follow. Leaders or experts can also join hands and create community groups (super groups) by merging or sharing resources from their own groups, evolving into an organically grown group structure for training and learning activities.

[Diagram: organically grown group structure]

Regardless of how large the overall organization is, a team-level implementation is simple and quick. L&D managers can still be the overall admins for the system and can visualize system usage and other analytics about adoption. They can even create healthy competition among teams to make the best use of the LMS. From the cost perspective, too, you can get a high return on investment by not buying a large number of LMS seats upfront; rather, follow a scaling model based on demand growth.

Since the bottom-up approach is based on participation by teams, it tends to be a more stable and likely more successful implementation than the top-down model. If the LMS permits (as EduBrite does), you can also have a mixed implementation approach in the same system.

[Diagram: mixed implementation]

Key product features that enable a bottom-up strategy are –

  • Allows training content creation, ownership, and provisioning rights for all users, so they can develop and deliver training for their teams
  • Allows group-level roles and sharing content across groups
  • Allows multiple group memberships for the same user, so they can play different roles in different groups
  • Makes it easy to evolve the group hierarchy and allows multiple alternate hierarchies to co-exist in the same LMS
  • Allows re-use of training materials to create different variations of courses and programs by re-packaging them for different groups
  • Hierarchical, group-based permissions for administration, data visibility, and reporting

The best implementation strategy depends heavily on your specific usage and may differ in each organization, but familiarity with the options, and the availability of the right features in the LMS, give you full flexibility.

 

SCORM Quiz – Item Analysis issues & solution

SCORM is widely used in the eLearning community, so I will not explain what it is; rather, I will get straight into the fundamental issue it presents for a Learning Management System (LMS) from a quiz-reporting perspective. This is based on my first-hand experience developing EduBrite LMS and seeing a wide variety of SCORM content through several customers.

Most LMSs (including EduBrite) have some kind of built-in quiz creation feature. (We are focusing the discussion only on LMSs that provide quiz-authoring capabilities.) As an eLearning content developer, you have the option to use the built-in quiz feature or to embed the quiz questions inside a SCORM package that you create using authoring tools (like Storyline or Captivate). You can even hand-code a SCORM package if you are willing to take a deep dive into the 700+ page specification and have reasonable experience with JavaScript.

In this article I will discuss the implications of this choice from the reporting perspective: SCORM-based quizzes vs. quizzes created natively in the LMS. This should also help eLearning developers set the right reporting expectations for an LMS.

Generally, for quizzes created in the LMS, we have seen far superior and more usable reporting; for SCORM-based quizzes, the reporting doesn't go as far or isn't as usable, especially from a non-technical user's perspective. This often leads to dissatisfaction among LMS customers, because they expect the LMS to provide the same usable reports regardless of whether they use SCORM or the built-in quiz.

At EduBrite we created a mechanism based on data mining to provide the same reporting for SCORM quizzes as is available for quizzes built directly in the LMS. But this feature is experimental and isn't foolproof yet: it doesn't cover all scenarios, especially considering the wide variety of authoring tools and a few areas where the SCORM specification leaves things open to implementation.

In this article I will first describe the technical challenge in reporting for SCORM-based quizzes, which explains the differences and limitations you will find when you use them in any LMS. I will also explain how EduBrite tried to solve it (although not perfectly), along with a few shortcomings of our solution.

To set the context for the remainder of this article, let's look at an example of a very common multiple-choice quiz question.

Question

What is 10+2?

Choices

  • 10
  • 11
  • 12
  • 13

 

Design time

First let's look at the design-time (authoring-time) difference from the data-awareness point of view; by design time, I mean until this question is attempted by a user. When you create the quiz question in the LMS, it knows everything about it: the question ID (internally assigned by the LMS), question type, question statement, choices, and the correct answer. But when you create the same question in SCORM and upload the package (zip) to the LMS, the LMS knows nothing about this question. What you packaged inside the SCORM zip file is completely opaque to the LMS, except for the manifest, which only describes the SCOs you have inside the package.

Runtime

Let's look at what data points the LMS can get in both cases when a student attempts the question.

A. LMS Quiz

When you use the LMS's built-in authoring, the LMS can capture the student's answer to this question and link it to the question ID it already knows.

Suppose the student picked the correct answer, 12 (the 3rd choice). The LMS would immediately know that out of the four available choices the user picked the 3rd one, which was correct; when the question was attempted; how much time the user spent on the question; and what the score for this attempt should be.

If the above question is attempted multiple times by multiple students, the LMS can provide an item-analysis report about the difficulty level of the question, e.g.

[Figure: item-analysis report]
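As a rough illustration, the difficulty index in such a report is typically computed as the fraction of attempts answered correctly (a minimal sketch; the exact formula an LMS uses may differ):

```python
def item_difficulty(attempts):
    """Difficulty index: fraction of attempts answered correctly.

    `attempts` is a list of booleans, one per student attempt
    (True = correct). Returns None when there is no data.
    """
    if not attempts:
        return None
    return sum(attempts) / len(attempts)

# 7 of 10 students answered correctly
print(item_difficulty([True] * 7 + [False] * 3))  # 0.7
```

A low index (few correct answers) flags a hard or possibly mis-keyed question; a very high index flags a trivial one.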

The LMS can also provide a report showing a student's attempt and the full context of the answers they selected.

[Figure: question result report]

B. SCORM Quiz

Now let's see how the situation changes if we use SCORM. When the student submits a response to the question, the SCORM content sends a set of data elements known in the specification as interactions. For example, SCORM might send something like this to the LMS –

cmi.interactions.0.id – Q1
cmi.interactions.0.type – choice
cmi.interactions.0.learner_response – 12
cmi.interactions.0.correct_responses._count – 1
cmi.interactions.0.correct_responses.0.pattern – 12
cmi.interactions.0.result – correct (we have seen variations like correct/incorrect or 1/0 in content produced by different authoring tools)
cmi.interactions.0.weighting – 1 (commonly interpreted as the score, or the relative score w.r.t. the total score)
cmi.interactions.0.timestamp – 2014-01-04T21:23:37 (interaction time)
cmi.interactions.0.latency – PT00H00M02S (time spent on this interaction)
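Flat key/value pairs like the above are what the LMS actually receives. A minimal Python sketch of how an LMS might group them back into per-interaction records (function and field names here are illustrative, not EduBrite's actual code):

```python
import re

def parse_interactions(elements):
    """Group flat cmi.interactions.* key/value pairs by interaction index.

    `elements` is a dict of raw SCORM runtime data, e.g.
    {"cmi.interactions.0.id": "Q1", "cmi.interactions.0.result": "correct"}.
    Returns {index: {field: value}}.
    """
    interactions = {}
    pattern = re.compile(r"cmi\.interactions\.(\d+)\.(.+)")
    for key, value in elements.items():
        match = pattern.match(key)
        if match:
            index, field = int(match.group(1)), match.group(2)
            interactions.setdefault(index, {})[field] = value
    return interactions
```

Each resulting record carries the ID, type, learner response, correctness, score, timestamp, and latency for one interaction.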

Upon receiving this data set, the LMS becomes aware of this question for the first time in its lifecycle. It knows the question ID, the type of question, the student's response, the correct response, whether the student's response was correct or incorrect, the score, the time of the attempt, and the time spent on the attempt.

Important things the LMS still doesn't know, which were available when the question was built in the LMS, are –

  • What exactly was the question (statement)?
  • How many choices were there in the question, and what other choices were available to pick from (correct or incorrect)?

 

We can address the first of the above two points by using SCORM 2004 (if the LMS also supports it). SCORM 2004 introduced a new data element, “description”, for interactions. You can send the following new element about the interaction to the LMS.

 cmi.interactions.0.description = What is 10+2

With this new element, the LMS can report what the question was, what the student's answer was, and whether it was correct or incorrect. But it still doesn't know about the other available choices (the three incorrect ones the student did not pick).

Solution

As an LMS provider, here is how we tried to tackle this issue and provide a full report similar to that for questions created in the LMS.

If a large number of students attempt the above question in SCORM over a sufficiently long period, statistically at some point some student will pick each available choice (probability 1/4). If the LMS can correlate several interaction records as corresponding to the same question ID, it can learn about all the other incorrect choices, or at least keep learning more possible incorrect choices over time.

E.g., when a student picks the first (incorrect) choice, the LMS will see the following data elements:

cmi.interactions.0.id – Q1
cmi.interactions.0.learner_response – 10
cmi.interactions.0.result – incorrect
cmi.interactions.0.weighting – 0

Assuming the LMS has seen question ID Q1 before, it can check whether it has also seen the answer 10. If not, it can add 10 to the known choices for the same question, and it also knows that this is an incorrect answer.

Similarly, when the LMS sees another incorrect answer

cmi.interactions.0.learner_response – 11

it learns that there is another incorrect option for the same question. Eventually the LMS will learn about the fourth (and final) option when it sees

cmi.interactions.0.learner_response – 13

By correlating all the above interactions to the same question, the LMS can fully re-engineer what the question looks like. It can then provide the same kind of report as for a quiz created natively in the LMS, and it can also show the question context when displaying the details of a student's attempt.

But to accomplish this correlation, the LMS must be able to unambiguously match question IDs across the many interactions (from many students) reported to it. The first requirement is to consider only interactions reported by the same SCORM package; this is where the ID of the SCORM package from the manifest can be used, along with the internal ID the LMS may have assigned to the uploaded package.
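A simple sketch of this learning step: accumulate, per (package ID, question ID), every response seen so far along with its correctness. The names here are hypothetical; EduBrite's production logic is more involved than this.

```python
from collections import defaultdict

# known_choices[(package_id, question_id)] -> {response: was_correct}
# learned incrementally as interactions arrive from the SCORM runtime
known_choices = defaultdict(dict)

def learn_from_interaction(package_id, question_id, learner_response, result):
    """Record one student's response, growing the set of known choices."""
    known_choices[(package_id, question_id)][learner_response] = (result == "correct")

# Over time, attempts by different students reveal the full choice list
learn_from_interaction("pkg-1", "Q1", "12", "correct")
learn_from_interaction("pkg-1", "Q1", "10", "incorrect")
learn_from_interaction("pkg-1", "Q1", "11", "incorrect")
learn_from_interaction("pkg-1", "Q1", "13", "incorrect")
# known_choices[("pkg-1", "Q1")] now holds all four choices with correctness
```

Keying on the package ID as well as the question ID keeps Q1 in one package from being confused with Q1 in another.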

So it appears we have a solution that can give the same (full) reports for quiz questions (interactions) embedded within SCORM. Nice. But there are a few cases where we need to be cautious.

1. Multiple Attempts (interactions)

Multiple interactions on the same question (re-attempts) present an interesting case. We noticed that different authoring tools (or eLearning developers) represent interaction IDs in different ways.

Some re-use the same question ID (effectively overwriting the previously stored answer), a technique referred to as State, while other tools add an attempt-count suffix to the question ID for each unique interaction, e.g. Q1_1, Q1_2, referred to as Journaling (ref – Tim Martin, http://scorm.com/blog/2010/11/4-things-every-scorm-test-should-do-when-reporting-interactions/). We also found inconsistencies among tools in how they generate IDs even when using Journaling to avoid overwriting answers from previous attempts.

This presents a potential problem for the reverse engineering, because the LMS can't cleanly (or consistently) correlate these interactions to the same question ID and might interpret each attempt of the same question as a new question. This effectively limits the accuracy of the item analysis, because the same question may be reported (or interpreted) as a different question depending on the attempt (first attempt, second attempt, and so on).

Based on our analysis of several packages from several authoring tools (such as those created in Storyline, Captivate, and a few others), we have devised pattern-based logic to derive the question ID and attempt number accurately. But it may not handle all authoring tools and ID naming conventions.
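As a simplified illustration of such pattern-based logic, a numeric suffix like _2 can be split off as the attempt number, defaulting to attempt 1 when no suffix is present. Real packages use many more conventions than this single pattern, so treat this as a sketch only:

```python
import re

def split_question_id(interaction_id):
    """Split a journaled interaction id like 'Q1_2' into (base_id, attempt).

    Ids without a numeric '_<n>' suffix are treated as attempt 1.
    """
    match = re.match(r"^(.*)_(\d+)$", interaction_id)
    if match:
        return match.group(1), int(match.group(2))
    return interaction_id, 1
```

With this, Q1, Q1_1, and Q1_2 can all be folded back onto base question Q1 for item analysis.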

2. SCORM ID in Manifest

If a content developer changes the SCORM content (questions and/or choices) but keeps the same ID in the manifest and replaces the existing uploaded package in the LMS, the reporting can go completely out of sync: the LMS will incorrectly correlate unrelated questions, because they are assumed to be part of the same SCORM package due to the same manifest ID. This can easily be avoided by using a new ID in the manifest (unless the changes are minor).

3. Randomization

If the SCORM package has internal logic to randomize the questions but doesn't send consistent interaction IDs regardless of position (sequence), the reporting becomes inconsistent. eLearning developers can solve this by assigning IDs in a consistent manner.

4. Multiple correct answers

We have noticed inconsistent behavior in how SCORM tools report correct_responses and learner_response. Some tools embed a choice identifier (like a, b, c) in the response, while others don't. Similarly, when there are multiple answers, some tools use comma-delimited lists while others use space, tab, or other conventions. This is one of the open problems we are working on; based on the known conventions of many tools, we can solve it to some extent.
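One partial mitigation is to normalize responses before comparison: split on the common delimiters and strip letter-style choice prefixes. This sketch handles only a few conventions we have seen; it is not a general solution:

```python
import re

def normalize_response(raw):
    """Split a response on common delimiters and strip letter prefixes.

    Handles comma/space/tab separated lists, the SCORM 2004 '[,]'
    separator, and 'a.'-style choice prefixes some tools prepend.
    """
    parts = re.split(r"\[,\]|[,\s]+", raw.strip())
    cleaned = []
    for part in parts:
        part = re.sub(r"^[a-z]\.\s*", "", part)  # drop 'a.' style prefixes
        if part:
            cleaned.append(part)
    return cleaned
```

After normalization, "a.10,c.12" and "10[,]12" both compare as the same answer set.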

5. Probability

We assume that statistically all choices will be picked at least once, but in practice there is no finite time frame in which this is guaranteed to happen. So when you look at the reports, you might find an incomplete list of choices for a question in the LMS.

All the above problems can be avoided during SCORM content development by paying a little closer attention to the IDs, and by keeping in mind that the runtime data SCORM sends to the LMS can be used for further correlation and analysis.

Higher Education – Competition at the Doorstep: Macro Impacts

Foreign institutes are knocking at our doorsteps. The Foreign Educational Institutions Bill was approved by India's cabinet in March 2010 and is pending Parliament's approval. Observers expect the bill to be passed by Parliament as is, or with minor changes.

India sends around 100,000 students per year to the US for higher education. Per V. Rangaraj, President of the Indo-American Society: “There is another almost four times the number of students who want to study in the US. Thus, there lies a huge opportunity for US educational institutions to access these students by bringing their brand of education into India.”

On the heels of this bill, the Indo-American Society is celebrating its golden jubilee with a focus on higher education. The event will be inaugurated by Union Human Resource Development Minister Kapil Sibal on July 30 and is expected to create collaborations between educational institutions as well as exchange programs involving students, faculty, and researchers from both India and the US. Prominent personalities from the US education sector are participating. Looking around the news and blogs, one thing is highly visible: administrators and chancellors of renowned US universities are visiting India to meet with government officials and local universities to explore opportunities.

Macro Impacts
Based on the information available so far, 150–200 foreign institutions are expected to be looking forward to putting up their brick-and-mortar footprints in India. The new dynamics will create a macro impact across the education sector – governing bodies, governing laws (reservations, etc.), the overall market, education service providers (institutions), and education receivers (students). In this blog, I am going to discuss the first three; the other two will be discussed in part 2.

Governing Bodies
Kapil Sibal and his Human Resources Development (HRD) team have called for abolishing existing governing bodies such as the University Grants Commission (UGC) and the All India Council for Technical Education (AICTE). Professor Yash Pal's report recommends the same, and he is vocal about it. These bodies are supposedly to be replaced by the National Commission for Higher Education and Research (NCHER), which is recommended as the core, central body to govern all disciplines, including the Medical Council of India, the Dental Council, and the Pharmacy Council. Negotiations between HRD and the ministries of Law and Health are underway to bring all governing bodies under NCHER.

While the Ministry of HRD (MHRD) is rallying hard behind one single exam for all-India entrances, it will be interesting to see how this process is impacted by the new entrants. Like the US entrance exams SAT/ACT and GRE/GMAT, India will have a single central exam (at least for non-state-governed colleges), governed by NCHER. Institutes will be more objective in granting admission to students (refer to my last blog – https://blog.edubrite.com/2010/06/27/). The arrival of foreign institutes in the market may act as a catalyst.

Governing laws
According to this bill, quota laws will not be applicable to foreign institutions. The reaction of the impacted groups is yet to be seen.

Overall Market
There are two categories of students who will have wider options for advanced studies: first, those who want to go abroad for further studies but don't get the opportunity; second, those who don't consider higher education an option. I highly doubt there will be any decrease in the cash outflow that the Ministry of Human Resources Development (MHRD) is expecting due to the brain drain of Indian students worldwide. In my opinion, some of the key reasons students go abroad are: (i) true international exposure; (ii) taking advantage of research and great educational opportunities; (iii) taking advantage of scholarships; (iv) looking to settle abroad; (v) status in society. Most of these factors will impact very few students, and the small percentage impacted will be replaced by the students next in line. However, the true opportunity is that over time India can turn into a regional “educational hub”, attracting students from Mid-West Asia, South East Asia, and Far East countries. This will subsidize the cash outflow.

Another interesting area to watch will be the overall curriculum and the learning and development of students inside and outside the classroom. Most higher-education institutes focus more on theory than on training newcomers in the practical aspects of starting work in the corporate, government, public, and non-profit sectors; most freshers go through on-the-job training before being exposed to the real action. If and when this changes, it will, in my opinion, definitely impact the job market positively. It's too early to comment further; your comments are welcome.

The bill calls for keeping the money in a corpus fund, with the surplus invested back into the development of the education sector. This could prove vital in creating infrastructure that fosters innovation.

The bill does not allow distance-learning programs by foreign institutes, which I see as a good opportunity for local institutes (discussed later). The bill also does not allow twinning programs, so it will be interesting to watch whether foreign institutes start from the ground up or find loopholes to acquire an existing local institute and turn it into their own brand.

Few open ended questions are:
1. Will foreign faculty also be hired?
2. What will be the value of these degrees/certificates in the foreign job market?
3. What are the schemes provided for students to get global exposure?
4. What will be the impact on the cost of the education in India?

India has a huge appetite for quality higher education. Institutes should not jump on the bandwagon of giving out degrees and certificates just for the sake of making money; instead they should focus on (a) developing core qualitative processes with customer-friendly services and (b) designing and creating an atmosphere that nurtures real-life exposure and fosters an innovative culture. India alone has a huge talent pool, and adding regional talent will cultivate more entrepreneurship. Nurturing change does not happen overnight; it is a building-block approach, and this bill brings us one step closer. I believe that in the long run this competition will change the face of our society – the innovative India we have all eagerly been waiting for.

Top 4 Challenges for Education in India

India is the largest democracy, with remarkable diversity among its population of 1.2 billion, which makes up about 17% of the world's population. Almost 70% of the Indian population is rural. The adult literacy rate stands at about 60%, and it is significantly lower among women and minorities. Education in India comprises government, government-aided, and private institutions, of which nearly 40% are government-run. With a population growth rate of 1.5%, there is tremendous pressure on the education system to provide quality education at an affordable price and improve the literacy rate.

Education in India faces the following primary challenges:

Quality

Maintaining the standard of education in more than a million schools nationwide, offering training programs to teachers, and keeping a good balance with education systems worldwide is a big challenge. Schools vary in size and resources and are forced to compromise on the all-round development opportunities they must provide to students.

Access

Given infrastructural constraints and social issues, it is hard to make education accessible to all segments of society (women, minorities, the poor).

Cost

The cost of education is very high even for the people and places where it is accessible. E.g., the competitive pressure on students and parents forces them to opt for private tuition and training to supplement school education.

Social & Cultural

The ethnic diversity of India poses challenges to implementing consistent education nationwide. More than 300 languages are spoken in the country, which makes it difficult to offer education tailored to each social segment. Educating women is a big issue in some societies. Children of poor families are forced to work and miss out on learning opportunities. Illiterate adults have very limited opportunities to get educated later in life.

Online Education System’s Advantages

An educational system augmented by online components presents a unique opportunity to solve a multitude of challenges quickly and on an affordable budget. Here is an overview of the advantages of an online system.

Improve Quality of Education

  • Computer aided adaptive testing
  • Encourage collaboration among students, teachers, parents, alumni, activists & institutions
  • A consistent grading system to measure and rank Students, Teachers, Schools & Universities
  • Reward all round development of students
  • Promote alternate education & ideas
  • Continuous improvement by statistical feedback

Improve Accessibility

  • Online & open information portal accessible anytime from anywhere to everyone
  • Bring the books & other resource (videos of lectures, speakers) online
  • Promote distance-learning initiatives to spread education in rural areas
  • Provide online courses to students with special needs.
  • 24×7 schooling for those who cannot attend regular schools during daytime

Reduce the cost of education

  • Services at lower cost via online solutions
  • Encourage “learn yourself” and “community learning” via online system, promote volunteers by providing common infrastructure at lower price
  • Tools for teachers, schools & exam boards to offer courses and conduct examinations & assessment
  • Measurement of returns and guidance on future spending

Social

  • Online system creates anytime, anywhere engagement model
  • Online Learning from home opens the doors for girls to get education if social & cultural reasons are preventing them.
  • Promote vocational courses and self paced learning for adults
  • Bring culturally diverse India on a common learning platform which is offered in all languages

Challenges in implementing Online System

  • Planning and implementation experience
  • Short term cost overheads (online & offline must run in parallel)
  • Electricity & Communication infrastructure
  • Social issues (to some extent)
  • Logistical challenges like training of educators & students
  • Technology constraints