Time to balance Learning and Management in LMS using Open Learning

There are hundreds of Learning Management Systems in the market, and almost all focus more on Management than on Learning. This results from LMS vendors primarily selling their systems to HR or L&D managers, who focus heavily on the Management aspect of the LMS to simplify their jobs and showcase the ROI on L&D investment. Products thus get shaped according to the needs of Management, while learning and learners' needs take a back seat in LMS procurement discussions. Look at any RFI or RFP for an LMS: it is full of requirements related to trainers and managers, who are mainly concerned with access controls and analytics. In fact, most LMS vendors rarely get a chance to talk to end learners, who use their product 99% of the time.

Due to this limited or non-existent interaction with end learners, vendors often participate in industry events to meet their available audience, i.e. e-learning / curriculum developers, trainers and managers. These attendees do share some of their learners' experiences (the ones they know about) with the vendors, but those experiences are not highlighted enough in the conversations. The majority of sessions and discussions still revolve around the need for more automation, notifications, tracking and reporting, and vendors keep producing more and more features aligned to these requirements. Many LMS products can also be traced back to their founders' backgrounds in the e-learning industry, where they used their e-learning design and/or training management experience to start a new LMS product.

Organizations have used M(anagement)-centric Learning Management Systems for a long time without much difference in learner outcomes or overall effect on the organization. Trainers and managers like to measure everything: enrollment counts, completion counts, time spent, scores, etc. They focus on finding the best LMS to micromanage the entire training operation effectively and to put access controls in place that block access to courses in many different ways. This style of micromanaged training operations works well in some use cases, like compliance and HR-centric trainings, but LMSs aren't supposed to be just Training Management Systems, are they?

In this decade, we have observed many new LMSs coming up with “simplicity” and “ease of use” as their core pitch. But what is not very apparent is: “ease of use” for whom? For the 1% of users (trainers and managers) or for the 99% (learners)? Our analysis suggests that most of the time “simplicity” is targeted exclusively at content creators, trainers and managers, who are the decision makers or act as influencers in LMS purchase decisions. If you talk to learners (or play that role yourself by taking online courses), you will realize how closed and boring today's LMSs are from their perspective. They are mostly used as platforms to deliver SCORM content and to enroll in instructor-led classes.

When we started EduBrite, we had no background in the e-learning/training industry. We only knew one side of the LMS in our imagination: the learner's side. We took inspiration from the open web and created the foundation of EduBrite to support “Open Learning” from day one. But we had to go through validation by customers, and given the established definition of LMS, where M is the most significant part for potential buyers, we realized we had failed to fully analyze the buyer's persona. Every interaction with prospects and customers made us realize what additional M(anagement) feature we didn't have. We kept adding those features to support complex training operations, which grew the platform to a level where we could not only match but outperform many established M-focused LMSs. Although this proved successful in growing the business, our passion was (and is) still to develop an “Open Learning” system, not only a “Training Management” system. So we kept building learner-centric features and field-tested whatever we built by using it for our own support site / user community (support.edubrite.com), which offers platform education to our customers. It worked well and was a self-validation of our belief that open platforms are needed for online self-paced learning. It was surprising to see how few other LMS vendors use their own product to educate their customers.

We announced “Open Learning” in mid 2016 as an add-on to EduBrite LMS, but demand for it from traditional buyers was still not there; at the moment, most internal employee education revolves around HR/onboarding/compliance, which limits the time learners have to use the LMS. On the other hand, we found many different use cases (and buyer personas) that were a better fit for Open Learning, e.g. VMware's customer education site, vmwarelearningzone.vmware.com, which uses EduBrite Open Learning extensively, similar to how we use it ourselves on support.edubrite.com.

Open Learning helps marketing as well as customer success teams (besides trainers and managers) by offering engaging content to all users (even anonymous ones) and letting learners easily find bite-size lessons and build their knowledge bit by bit. Once learners have more time, they can enroll in larger units like Courses and Programs (learning plans), or even instructor-led classes related to their interests and needs. They get the ability to engage in meaningful conversation with the community, self-claim mastery points and create their own curated Playlists, rather than going through pre-built courses laid out in a fixed sequence. While learners get their share of tools, trainers are not left behind: they can use EduBrite's powerful features to offer instructor-led trainings and advanced certifications.

EduBrite’s Open Learning combines the power of traditional LMS, fun of Learner centric Microlearning, freedom of Community and wisdom of Knowledgebase to offer a full suite of tools needed for customer as well as employee education. This approach finally finds a balance between L and M in the LMS platform.

EduBrite 2016: A year in review

It was a great year for EduBrite. Many incredible things happened in 2016. Thanks for being part of this journey with us.

We introduced Open Learning, a learner-centric microlearning solution, in 2016. EduBrite is one of the first in the LMS industry to offer such a solution. Open Learning helps create an engaging learning environment for your employees as well as external audiences: customers, partners, etc. You can set up a public-facing Open Learning site and leverage it as a great tool for content marketing.

Users can create their own playlists, and follow, rate and comment on content. They can also share content socially with their peer group and claim mastery points in the subject areas they are expert in.

Open Learning is also a gateway to progressive learning. It directs users to register for courses to formally learn their topics of interest, so you can offer an integrated experience to your learners.

In addition to Open Learning, we have added many new features, enhanced existing ones, improved usability and added automation, with the key goal of continuing to improve the experience of our customers and their users. Here is a quick snapshot:

Gamification – You can create badges easily by uploading external images or using images from the library. An enhanced leaderboard is available.
Instructor led training – New integration with Zoom.us. New functions, e.g. downloading the sign-in sheet, manually generating invoices, editing amounts, adding discounts, taking notes, and easy options for marking attendance and awarding completion.
Collaboration – Forums now offer features such as forum priority, assignment, sub-forums, etc. to provide lightweight task/issue tracking capabilities.
Custom properties – User-defined custom properties can now be added in various places, e.g. Group, Course, Course Session, Program, Event, Course Member, Program Member and Event Participants. This allows you to extend the LMS objects in very flexible ways to capture your business-specific information.
Authoring – Many enhancements were added to test and course authoring throughout the year.
New Role – A new role, group coordinator, was added to offer additional flexibility in decentralizing the access and management of training operations.
Reporting – Enhanced many reports, e.g. the learners report, program report, question statistics report, etc. Added a new transcript summary report.
Confluence Integration – In addition to many new macros, e.g. for customizing the dashboard, reports, leaderboard, etc., you can now auto-sync users and groups.

We make it easy for you

We make your experience easy by offering integrations with Atlassian Confluence, Salesforce, Yammer, Google, etc. This allows users to access EduBrite LMS right from the apps where they spend most of their time. Additionally, we have integrated with many other applications so that you can use best-of-breed tools and provide a great experience to your learners.


We have a pretty exciting 2017 ahead of us! One of our primary focuses will be to offer an even more engaging learning experience, so we have planned for several enhancements in Open Learning and Gamification. We understand that you want to offer ubiquitous learning, so expect more developments in our native iOS App and our recently released Android App.


Thanks for an amazing 2016. Happy Learning in 2017!

Top Down & Bottom Up LMS implementation approaches for employee training

When it comes to LMS implementation in an organization for internal training, the most common approach is top-down. The top-down approach is heavily based on maintaining controls at the top level and selectively giving some rights to lower levels in the training delivery hierarchy (which is generally the same as the org structure).


This approach requires a lot of upfront planning: identifying stakeholders across the organization and resolving internal conflicts among different teams to create common ground for a centralized LMS. On many occasions, this process takes several months to a year before the LMS can be fully rolled out to employees, resulting in a huge upfront cost. Even after rollout, the top-down control of rights creates an ongoing battle for control and sometimes conflicts of interest among the different groups who need to use the LMS. All these factors limit adoption and the return on investment.

If we look deeper into why this happens so frequently, the biggest cause is a common issue with many LMS products: most are built around the assumption of global role-based controls. In most systems, only an admin or instructor can perform activities like course creation and delivery setup. These rights are not available at the individual team or group level, resulting in big process bottlenecks. Course content has to be managed by a few selected people (L&D managers or curriculum developers) for a very large number of teams. Team leads do not get the ability to quickly create their own training courses and deliver them to their teams. This also makes the job very tough for L&D managers, who typically sit at the top of the training delivery hierarchy: they must find time to create and facilitate training courses and provision them for the entire organization. The process takes too much time to plan and becomes difficult to execute. As a result, not all departments get equal attention; they become indifferent to the LMS or find their own (ad hoc) solutions. A clear symptom of this effect is the existence of multiple LMSs in the same organization, owned by different teams. Although this gives each team full control over its own LMS, the organization as a whole doesn't get a clean, coordinated learning environment, and the cost becomes too high.

The top-down approach has its merits in many kinds of training, such as compliance, that require full control, but there are several other kinds of training usage (especially informal) where it becomes a limiting factor.

EduBrite offers a clean solution to this problem by enabling a Bottom Up implementation strategy. We discussed this topic in our webinar last week; the recording can be accessed here – http://bit.ly/1mSodHO


The bottom-up approach works by allowing everyone to create training content (courses) and manage its delivery. LMS implementation can be done at rapid speed with just an announcement of its availability and maybe a “getting started” video, allowing different team leads to start using it for their teams. No setup of org hierarchy, departments, roles, etc. is needed. The teams who need the LMS most can be the first to adopt it and lead the way for other teams to follow. Leaders or experts can also join hands and create community groups (super groups) by merging or sharing resources from their own groups, evolving into an organically grown group structure for training/learning activities.


Regardless of how large the overall organization is, team-level implementation is simple and quick. L&D managers can still be the overall admins for the system and can visualize system usage and other analytics about adoption. They can even create healthy competition among teams to make the best use of the LMS. From the cost perspective, you can also get a higher return on investment by not buying a large number of LMS seats upfront; rather, follow a scaling model based on demand growth.

Since the bottom-up approach is based on participation by teams, it becomes a more stable and likely more successful implementation compared to the top-down model. If the LMS permits (like EduBrite does), you can also have a mixed implementation approach in the same system.


The key product features that enable a bottom-up strategy are –

  • Allows training content creation, ownership and provisioning rights for all users, so they can develop and deliver trainings for their teams
  • Allows group-level roles and sharing of content across groups
  • Allows multiple group memberships for the same user, so they can play different roles in different groups
  • Makes it easy to evolve the group hierarchy and allows multiple alternate hierarchies to co-exist in the same LMS
  • Allows re-use of training materials to create different variations of courses and programs by re-packaging them for different groups
  • Provides hierarchical group-based permissions for administration, data visibility and reporting

The best implementation strategy depends heavily on specific usage and may differ in each organization, but familiarity with the options, and the availability of the right features in the LMS, give you full flexibility.


SCORM Quiz – Item Analysis issues & solution

SCORM is widely used in the eLearning community, so I will not get into what it is; rather, I will go straight into the fundamental issue it presents for a Learning Management System (LMS) from a quiz-reporting perspective. This is based on my first-hand experience developing EduBrite LMS and seeing a wide variety of SCORM content through several customers.

Most LMSs (including EduBrite) have some kind of built-in quiz creation feature. (We are focusing this discussion only on LMSs that provide quiz-authoring capabilities.) As an eLearning content developer, you have the option to use the built-in quiz feature or to embed the quiz questions inside a SCORM package that you create using authoring tools (like Storyline or Captivate). You can even hand-code a SCORM package if you are willing to take a deep dive into the 700+ page specification and have reasonable experience with JavaScript.

In this article I will discuss the reporting implications of your choice between SCORM-based quizzes and quizzes created natively in the LMS. This should also help eLearning developers set the right reporting expectations from an LMS.

Generally, for quizzes created in the LMS we have seen far superior and more usable reporting; for SCORM-based quizzes, the reporting doesn't go as far or isn't as usable, especially from a non-technical user's perspective. This often leads to dissatisfaction among LMS customers, because they expect the LMS to provide the same usable reports regardless of whether they use SCORM or the LMS's built-in quizzes.

At EduBrite we created a mechanism, based on data mining, to provide the same reporting for SCORM quizzes as is available for quizzes built directly in the LMS. But this feature is experimental and isn't foolproof yet in covering all scenarios, especially considering the wide variety of authoring tools and the few areas where the SCORM specification leaves things open to implementation.

In this article I will first describe the technical challenge in reporting for SCORM-based quizzes, which explains the differences and limitations you will find when you use them in any LMS. I will also explain how EduBrite tried to solve it (although not perfectly), and the few shortcomings in our solution.

To set the context for the remainder of this article, let's look at an example of a very common multiple-choice quiz question.


What is 10 + 2?


  • 10
  • 11
  • 12
  • 13


Design time

First, let's look at the design-time (authoring-time) difference from the data-awareness point of view; by design time, I mean everything before this question is attempted by a user. When you create the quiz/question in the LMS, it knows everything about it: the question ID (internally assigned by the LMS), question type, question statement, choices, and correct answer. But when you create the same question in SCORM and upload the package (zip) into the LMS, the LMS knows nothing about this question. What you packaged inside the SCORM zip file is completely opaque to the LMS, except for the manifest, which only describes the SCOs inside the package.


Let’s look at what data points LMS can get in both cases, when a student attempts the question.

A. LMS Quiz

When you use the built-in authoring of the LMS, it is able to capture the student's answer to this question and link it to the already-known question ID in the LMS.

Suppose the student picked the correct answer, 12 (the 3rd choice). The LMS would immediately know that out of the four available choices the user picked the 3rd, which was correct, when the question was attempted, how much time the user spent on it, and what the score for this attempt should be.

If the above question is attempted multiple times by multiple students, the LMS can provide an Item Analysis report about the difficulty level of the question.


The LMS can also provide a report showing a student's attempt and the full context of the answers they selected.



Now let's see how the situation changes if we were using SCORM. When the student submits the response to the question, SCORM will send a set of data elements known in the specification as interactions. For example, SCORM might send something like this to the LMS –

cmi.interactions.0.id – Q1
cmi.interactions.0.type – choice
cmi.interactions.0.learner_response – 12
cmi.interactions.0.correct_responses._count – 1
cmi.interactions.0.correct_responses.0.pattern – 12
cmi.interactions.0.result – correct (we have seen variations like correct/incorrect or 1/0 in content produced by different authoring tools)
cmi.interactions.0.weighting – 1 (commonly interpreted as score, or score relative to the total)
cmi.interactions.0.timestamp – 2014-01-04T21:23:37 (interaction time)
cmi.interactions.0.latency – PT00H00M02S (time spent on this interaction)

So upon receiving this data set, the LMS becomes aware of this question for the first time in its lifecycle. It knows the question ID, the type of question, the student's response, the correct response, whether the student's response was correct or incorrect, the score, the time of the attempt and the time spent on the attempt.
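A minimal sketch of this first step on the LMS side, folding the flat cmi.interactions.* key/value pairs into one record per interaction (the function name and record layout are illustrative, not part of the SCORM specification):

```python
# Fold flat "cmi.interactions.<i>.<field>" key/value pairs into one
# dict per interaction index. Field names follow the SCORM run-time
# data model; everything else here is an illustrative sketch.
def parse_interactions(cmi_data):
    interactions = {}
    for key, value in cmi_data.items():
        parts = key.split(".")
        if len(parts) < 4 or parts[:2] != ["cmi", "interactions"]:
            continue  # not an interaction element
        index, field = parts[2], ".".join(parts[3:])
        interactions.setdefault(index, {})[field] = value
    # indices arrive as strings; sort order is fine for small examples
    return [interactions[i] for i in sorted(interactions)]

sample = {
    "cmi.interactions.0.id": "Q1",
    "cmi.interactions.0.type": "choice",
    "cmi.interactions.0.learner_response": "12",
    "cmi.interactions.0.correct_responses.0.pattern": "12",
    "cmi.interactions.0.result": "correct",
}
records = parse_interactions(sample)
# records[0] is now one self-contained description of the attempt
```

Once the flat data is grouped per interaction like this, the rest of the analysis can work with whole attempt records instead of individual key/value pairs.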

Important things the LMS still doesn't know, which were available when the question was built in the LMS, are –

  • What exactly was the question (statement)?
  • How many choices were there in the question, and what other choices (correct or incorrect) were available to pick from?


We can address the first of the above two points by using SCORM 2004 (if the LMS also supports it). SCORM 2004 introduced a new “description” data element for interactions. You can send the following new element about the interaction to the LMS.

 cmi.interactions.0.description = What is 10+2

With this new element, the LMS can report what the question was, what the student's answer was, and whether it was correct or incorrect. But it still doesn't know about the other available choices (the three that are incorrect and were not picked by the student).


As an LMS provider, here is how we tried to tackle this issue and provide a full report, similar to questions created in the LMS.

If a large number of students attempt the above question in SCORM over a sufficiently long period, statistically at some point some student will pick each available choice (probability 1/4 for any given choice). And if the LMS could correlate several interaction records to the same question ID, it could learn about all the other incorrect choices, or keep learning more possible incorrect choices over time.

E.g. when a student picks the first (incorrect) choice, the LMS will see the following data elements:

cmi.interactions.0.id – Q1
cmi.interactions.0.learner_response – 10
cmi.interactions.0.result – incorrect
cmi.interactions.0.weighting – 0

Assuming the LMS has seen question ID Q1 before, it can check whether it has also seen the answer 10 before. If not, it can add 10 to the set of known choices for the question, and it also knows that this is an incorrect answer.

Similarly, when the LMS sees another incorrect answer

cmi.interactions.0.learner_response – 11

it learns that there is another incorrect option available for the same question. Eventually the LMS will learn about the fourth (and final) option when it sees

cmi.interactions.0.learner_response – 13

By correlating all the above interactions to the same question, it can fully reconstruct what the question looks like. And now it can provide the same kind of report as for a quiz created natively in the LMS. It can also show the question context when showing the details of a student's attempt.

But in order to accomplish this correlation, the LMS must be able to unambiguously match question IDs across the many interactions (from many students) that are reported to it. The first requirement is to only consider interactions reported by the same SCORM package, and this is where the ID of the SCORM package given in the manifest can be used, along with the internal ID the LMS may have assigned to the uploaded package.
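The correlation idea can be sketched like this, keying learned choices by (package ID, question ID) so interactions from different SCORM packages are never mixed (the class and method names are illustrative, not EduBrite's actual implementation):

```python
# Accumulate every distinct learner_response seen for a question,
# keyed by (SCORM package id, interaction id), and remember which
# responses were marked correct. Over time this recovers the full
# choice list for each question.
from collections import defaultdict

class ChoiceLearner:
    def __init__(self):
        self.choices = defaultdict(set)  # (pkg, qid) -> all responses seen
        self.correct = defaultdict(set)  # (pkg, qid) -> responses marked correct

    def record(self, package_id, question_id, response, result):
        key = (package_id, question_id)
        self.choices[key].add(response)
        if result == "correct":
            self.correct[key].add(response)

learner = ChoiceLearner()
# Responses reported by different students for the same question:
for resp, result in [("12", "correct"), ("10", "incorrect"),
                     ("11", "incorrect"), ("13", "incorrect")]:
    learner.record("pkg-001", "Q1", resp, result)
# learner.choices[("pkg-001", "Q1")] now holds all four options
```

The composite key is the important design choice here: without the package ID in the key, two unrelated questions that happen to share the interaction ID "Q1" in different packages would be merged into one corrupt record.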

So it appears we have a solution that can give the same (full) reports for quiz questions (interactions) embedded within SCORM. Nice. But there are a few cases where we need to be cautious.

1. Multiple Attempts (interactions)

Multiple interactions on the same question (re-attempts) present an interesting case. We noticed that different authoring tools (or eLearning developers) represent interaction IDs in different ways.

Some re-use the same question ID (effectively overwriting the previously stored answer), a technique referred to as State, while other tools add an attempt-count suffix to the question ID for each unique interaction, e.g. Q1_1, Q1_2, etc., referred to as Journaling (ref – Tim Martin http://scorm.com/blog/2010/11/4-things-every-scorm-test-should-do-when-reporting-interactions/). We found inconsistencies among tools in how they generate IDs, even when using Journaling to avoid overwriting answers from previous attempts.

This presents a potential problem when reverse engineering, because the LMS can't cleanly (or consistently) correlate these interactions to the same question ID and might interpret each attempt of the same question as a new question. This effectively limits the accuracy of the item analysis, because the same question may be reported (or interpreted) as a different question depending on the attempt (first attempt, second attempt).

Based on our analysis of many packages from several authoring tools (such as those created in Storyline, Captivate and a few others), we have devised pattern-based logic to derive the question ID and attempt number accurately. But this may not be fully accurate for all authoring tools and ID naming conventions.
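As an illustration of such pattern-based logic, here is a sketch that splits a Journaling-style ID into a base question ID and an attempt number (the underscore-suffix convention is just one we have observed; real tools vary, and an ID like Q_1 is inherently ambiguous under this rule):

```python
import re

# Split a Journaling-style interaction id like "Q1_2" into the base
# question id and an attempt number. Ids without a numeric suffix are
# treated as attempt 1 (the State style, where answers overwrite).
_SUFFIX = re.compile(r"^(?P<base>.+?)_(?P<attempt>\d+)$")

def normalize_id(interaction_id):
    m = _SUFFIX.match(interaction_id)
    if m:
        return m.group("base"), int(m.group("attempt"))
    return interaction_id, 1
```

With IDs normalized this way, Q1_1 and Q1_2 correlate to the same underlying question while still preserving the attempt number for per-attempt reporting.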

2. SCORM ID in Manifest

If a content developer changes the SCORM content (questions and/or choices) but keeps the same ID in the manifest and replaces the existing uploaded package in the LMS, the reporting can go completely out of sync: the LMS would incorrectly correlate unrelated questions, since they would be assumed to be part of the same SCORM package due to the same manifest ID. This can be avoided easily by using a new ID in the manifest (unless the changes are minor).

3. Randomization

If the SCORM package has internal logic to randomize the questions but doesn't send consistent interaction IDs regardless of position (sequence), the reporting becomes inconsistent. eLearning developers can solve this by using IDs in a consistent manner.

4. Multiple correct answers

We have noticed inconsistent behavior in how SCORM tools report correct_responses and learner_response. Some tools embed a choice identifier (like a, b, c) in the response, while others don't. Similarly, when there are multiple answers, some tools use comma-delimited lists while others use space, tab or other conventions. This is one of the open problems we are working on; based on the known conventions of many tools, we can solve it to some extent.
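A hedged sketch of normalizing a multi-answer learner_response across delimiter conventions (the delimiters and the choice-identifier prefix handling reflect conventions we have observed, not an exhaustive or authoritative list):

```python
import re

# Split a multi-answer learner_response on any of the delimiter
# conventions we have seen (comma, space, tab, or SCORM 2004's "[,]"),
# and strip a leading "a." / "b)" style choice identifier if present.
def normalize_response(raw):
    parts = re.split(r"\[,\]|[,\t ]+", raw.strip())
    cleaned = []
    for part in parts:
        part = re.sub(r"^[a-z][.)]\s*", "", part)  # drop "a." / "b)" prefix
        if part:
            cleaned.append(part)
    return sorted(cleaned)
```

Returning a sorted list makes responses comparable regardless of the order in which the tool reported the answers, so "12,10" and "10 12" normalize to the same value.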

5. Probability

We assume that statistically every choice will be picked at least once, but in practice there is no guaranteed time frame in which this will happen. So when you look at the reports, you might find an incomplete list of choices for a question in the LMS.

All the above problems can be avoided during SCORM content development by paying a little closer attention to the IDs, and by keeping in mind that the runtime data SCORM sends to the LMS can be used for further correlation and analysis.

For Profit – External Training

We have all heard about ‘For Profit’ education; the key concept is to generate enough revenue not only to finance all of the business operations but also to maximize profit for the business. For-profit education spans multiple domains, from institutions delivering K-12 and higher education, to professional training institutes, to corporate-level career education. In this article we are going to focus on the latter domain and talk about the key components of ‘for-profit’ corporate-level professional and career education.

Many corporate organizations have centralized education services departments with names like XYZ University or XYZ Global Education Services. Their main objective is to deliver training not just for internal employees (for enablement and compliance) but also for their customers and partners, collecting revenue from it; hence the name ‘External’ or ‘For Profit’ training. The business objective of these departments is to maximize training revenue by training their customers and partners (a better-trained, better-informed customer is a more satisfied customer), which in turn results in selling more products (and training). In order to succeed, these departments need solid infrastructure in terms of tools and technologies, and that is where robust, flexible, state-of-the-art, easy-to-use and configurable learning platforms play a huge role. We are going to talk about the key components of such a platform that make an external training business successful.

What are the key components required to make external training successful? Let us look at the key features, which apply to most businesses these days.

Easy to use User Interface: Any application or tool that is customer- or end-user-facing needs to be truly exceptional at serving the needs of its customers. It needs to be simple and intuitive, and to deliver the desired results in no time and with a minimal number of clicks. It needs not just to be action-oriented but also intelligent and personalized, able to recommend and prescribe trainings.

Configurability: This is another key area where most platform vendors are focusing these days. Gone are the days when applications were customized inside the firewall by development teams; many platforms are now available in the cloud as a SaaS service. Configurability is key to successfully launching the tools and having them easily adopted by the target audience, and it also keeps the cost of ownership low.

Training Delivery Methods: The platform needs to offer the standard training delivery modes: Instructor-Led, Online/Self-Paced, Virtual Instructor-Led, etc.

Bundling and Packaging Ability: Complex businesses and use cases often demand complex and flexible offering structures. If the platform offers the capability to construct flexible training offerings, it gives the business a huge advantage and the ability to meet complex demands. Any course or content can be packaged or bundled with various paths and/or rules and flexible pricing models, meeting the demands of multiple use cases in various countries.

Multiple Payment Methods and Credit Card Integration: Since we are talking about a commercial business, training platforms need to offer flexibility in payment methods and integration with all popular payment platforms.

Discounting and Promotions: Every commercial business also demands multiple options for promotions and discounting: X% (or a fixed amount) off a certain group of trainings in a certain period, a Y% (or fixed amount) discount for a certain audience group or market segment, or a promotion on new training products for a limited period.

Training Units or Virtual Currency: Big training businesses dealing with other big customers often want to sell training retainers (similar to a gift-card model), creating a training delivery pipeline that can be realized over an upcoming period, typically valid for 1-2 years. The education business may sell a set of training units equivalent to a dollar amount, which the customer can use to purchase training services as and when needed.

Subscription Models: Some businesses operate their training on an all-you-can-eat buffet model. They sell yearly subscriptions to their customers for a set of training offerings, and it is then up to the customers how they take advantage of the subscription service they have bought.

Mobile and Connected Training (on the go): The past few years have seen a trend of making training available on mobile devices. This is very helpful for salespeople, who can access training anytime and anywhere as they travel to customer sites. With advances in mobile platforms such as tablets and smartphones, this has become much easier. Learning platforms that integrate easily and deliver a seamless experience across devices, while keeping the user connected to the content (along with their progress), are in high demand these days.

Social Platform Integrations: Learning platforms that deliver training but do not provide a social interface are missing out big time on user engagement and informal learning. Users today demand the ability to collaborate with their peers and industry experts, and this is becoming one of the selection criteria for learning platform vendors.

Managing Delivery Operations: Among all the features listed here, this is one of the key capabilities of a learning platform. To run any business, you need to be able to manage and administer your operations: creating content, listing classes for enrollment, managing the resources needed to deliver classes and content, administering the learner experience before and after classes, and so on. A solid administrative foundation must be part of the platform to support running the business.

P&L Analysis with Reporting: Finally, when you run a business, wouldn't you like to analyze total cost and revenue to see whether the business you are investing in, and the way you are running it, is resulting in a profit or a loss? Commercial learning platforms need to provide reports and analytics that give the operations team the necessary business visibility. Businesses also demand that this be integrated with corporate reporting systems for the same purpose.
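The essence of such a P&L report is a rollup of revenue and cost per offering. A small sketch with made-up offering names and figures, purely to illustrate the aggregation:

```python
from collections import defaultdict

# Hypothetical transaction feed: (offering, revenue, delivery cost).
transactions = [
    ("Security 101", 12000.0, 7000.0),
    ("Security 101",  8000.0, 5000.0),
    ("Leadership",    4000.0, 6500.0),
]

# Roll up revenue and cost per offering.
totals = defaultdict(lambda: [0.0, 0.0])
for offering, revenue, cost in transactions:
    totals[offering][0] += revenue
    totals[offering][1] += cost

# Report profit or loss per offering.
for offering, (revenue, cost) in sorted(totals.items()):
    profit = revenue - cost
    status = "profit" if profit >= 0 else "loss"
    print(f"{offering}: revenue={revenue:.0f} cost={cost:.0f} "
          f"{status}={abs(profit):.0f}")
```

A real platform would pull these figures from enrollment, payment, and resourcing records, and expose the same rollup through its reporting APIs so corporate BI systems can consume it.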


Author: Praveen Khurana
Praveen Khurana is a learning technology leader in learning management and human capital systems. He has 18+ years of experience in the industry and has consulted on and implemented learning, talent, and knowledge management systems for many Fortune 500 companies.

Test Authoring and Delivery – EduBrite Education Webinar

In last week’s webinar, we discussed test authoring and delivery, covering how quizzes, tests, and surveys can be created using EduBrite’s WYSIWYG test editor. Thanks to all participants who were able to join us. We discussed the following items in detail:

  • Questions formats – Multiple choice, Fill in the blank, Yes/No, Essay, Grid Type etc.
  • Various Test properties including
    • Create question bank
    • Re-using questions from existing tests
    • Inserting images, audio, video
    • Inserting mathematical and Scientific symbols
    • Gradient scoring, negative marking
    • Adding passage, solution
    • Adding metadata at question or test level
    • Randomize questions, choices
    • Competency-based test, exams
    • Various UI options
    • Categorizing questions into sections
    • Limiting time based on sections
    • Essay evaluation
    • Random selection of questions from question bank
  • Test delivery : informal (practice tests) and formal inside an exam, plus exam management. Quizzes inside a course were covered in the previous course authoring and course delivery webinars.
  • Proctored Exam
  • Multi Source feedback (360 degree feedback)

A recording of the session is now available on the Webinar Channel in the EduBrite Support portal. We will announce the next round of webinars soon. Stay tuned.

EduBrite LMS Microsite administration

Microsite administration was the topic of last week’s webinar, in which we covered the most common activities an LMS administrator typically performs. Thanks to all participants who were able to join us. We discussed the following items:

  • User Management : User Creation (individual, bulk), User roles, User assignment to group(s)
  • Group Management : Group Creation, Type of Groups (open, private, etc.), group roles, group settings, collaboration
  • Site administration : Configure features, security, menu, metadata, payment integration
  • Others : E-mail settings, proxy login

A recording of the session is now available on the Webinar Channel in the EduBrite Support portal.

Our next webinar, “Test Authoring and Delivery,” is on Wednesday, Sep 24 at 10 am Pacific (UTC-7). In this webinar, we will cover:

  • Test / Survey creation
  • Questions formats – Multiple choice, Fill in the blank, Yes/No, Essay, Grid Type etc.
  • Test properties : Basic and advance
  • Test delivery : informal (Practice tests); Formal inside course or as an exam
  • Reporting

Register here for the next webinar. Starting last week, online webinars are being delivered directly from EduBrite LMS. For more details on how to register and join, watch the mini video.
