1. Study Participant Profile
The Education Landscape Study Coordinator should identify at least 20 Study Participants for the Defined Region.
Ten Study Participants from Educational Institutions: academics representing universities and programmes in the architecture, construction and built environment disciplines within the Defined Region.
There may be more than one respondent per university; for example, one academic representing an architecture programme and another representing a construction management programme.
Ten Study Participants from Training Institutions: representatives from firms/institutions providing BIM training as short courses. This may include technology service providers, specialised training providers, professional bodies, industry associations, technology advocates, etc.
In total, identify at least 10 participants from Educational Institutions and 10 from Training Institutions.
The sampling strategy for academics should recruit participants from different areas within the Defined Region, as they will be reporting on their own university programmes (e.g., at least one academic from each country or state in the Defined Region).
Training Institution representatives will report on availability in the Defined Region as a whole, not on their own courses; therefore, geographical distribution is not necessary for this participant type.
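For coordinators who track recruitment programmatically, the following Python sketch shows one possible way to check the quotas and geographical spread described above. The participant records, field names and function are illustrative assumptions, not part of the study specification.

```python
from collections import Counter

# Illustrative participant records; the "type" and "area" fields mirror the
# profile above, and the names are placeholders.
participants = [
    {"name": "Academic A", "type": "educational", "area": "Country 1"},
    {"name": "Academic B", "type": "educational", "area": "Country 2"},
    {"name": "Trainer C",  "type": "training",    "area": "Country 1"},
    # ... at least 20 participants in total
]

def check_recruitment(participants, minimum_per_group=10):
    """Report whether the quotas and geographical spread meet the profile."""
    by_type = Counter(p["type"] for p in participants)
    academic_areas = {p["area"] for p in participants if p["type"] == "educational"}

    print(f"Educational Institution participants: {by_type['educational']} (minimum {minimum_per_group})")
    print(f"Training Institution participants: {by_type['training']} (minimum {minimum_per_group})")
    # Geographical spread only matters for academics, who report on their own programmes.
    print(f"Areas covered by academics: {sorted(academic_areas)}")

check_recruitment(participants)
```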
2. Study Method
Step 1: Send the online survey to vetted Study Participants
- Study Participants should adhere to the specified profile
- Study Participants complete the survey
- Study Coordinator ensures that all online surveys are completed and submitted
Step 2: Validation of Answers
- The Study Coordinator analyses the data and identifies all the divergent claims/answers
- The Study Coordinator inspects the Evidence provided against the Level of Evidence (LoE) metric (defined in Section 3)
- The Study Coordinator distils an answer that is closest to ‘reality’
Step 3: Publication of Results
3. Study Validation
Due to the nature of the study, some responses must depict a single ‘reality’ (e.g., Coverage of an Educational Framework in the Region). The review of Study Participants’ survey responses will reveal one of two possibilities:
- Answers converge (most answers are the same; there is practical consensus)
  - The answer/claim is accepted
- Answers diverge (there is no consensus)
  - The Study Coordinator investigates which of the answers reflects ‘reality’, enters that answer, and adds notes justifying the choice
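As a rough illustration of this convergence check, the Python sketch below classifies the answers to a single question as convergent or divergent. The data layout and the majority threshold are assumptions made for the example; the study itself only requires that most answers agree.

```python
from collections import Counter

def review_answers(answers, consensus_share=0.75):
    """Classify the answers to one question as convergent or divergent.

    `consensus_share` is an illustrative threshold; the study itself only
    requires that most answers agree.
    """
    counts = Counter(answers)
    top_answer, top_count = counts.most_common(1)[0]
    if top_count / len(answers) >= consensus_share:
        return {"status": "converged", "accepted": top_answer}
    # Divergent: the Study Coordinator inspects the evidence and records a
    # justified answer manually.
    return {"status": "diverged", "accepted": None, "counts": dict(counts)}

# Example answers to a single availability question.
print(review_answers(["Yes", "Yes", "Yes", "No"]))          # converged on "Yes"
print(review_answers(["Yes", "No", "No", "I don't know"]))  # diverged: coordinator review
```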
The Level of Evidence (LoE) is the number and type of artefacts requested for inspection by the study team during a BIMe Assessment. The LoE includes five levels:
- [0] no claim (Availability Question not answered, answered as ‘No’, or answered as ‘I don’t know’)
- [1] claimed (Availability Question answered as ‘Yes’)
- [2] exhibited/demonstrated (artefact uploaded)
- [3] inspected/analysed
- [4] audited/certified
The LoE also applies to the types of questions posed, or the types of answers expected, during BIMe Assessments.
Some sections of the Educational Landscape Study will require a Level of Evidence to be provided (e.g., a website, syllabus, or published papers). For instance, Content of Educational Units requires a syllabus, and Research Deliverables require a website or links to published papers. The Study Coordinator will rate each answer against the LoE scale.
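Because the LoE scale is a simple ordinal enumeration, it can be encoded directly when tabulating responses. The sketch below shows one possible encoding of the five levels and an automatic rating for levels 0–2; the response fields (availability, artefact_uploaded) are illustrative assumptions, and levels 3 and 4 are assigned only after the study team inspects or audits the artefact.

```python
from enum import IntEnum

class LoE(IntEnum):
    """Level of Evidence scale used to rate survey answers."""
    NO_CLAIM = 0    # availability not answered, answered 'No', or 'I don't know'
    CLAIMED = 1     # availability answered 'Yes'
    EXHIBITED = 2   # supporting artefact uploaded
    INSPECTED = 3   # artefact inspected/analysed by the study team
    AUDITED = 4     # artefact audited/certified

def initial_loe(availability, artefact_uploaded):
    """Assign LoE 0-2 from a raw survey response (field names are illustrative).

    Levels 3 and 4 are assigned only after the Study Coordinator inspects or
    audits the uploaded artefact.
    """
    if availability != "Yes":
        return LoE.NO_CLAIM
    return LoE.EXHIBITED if artefact_uploaded else LoE.CLAIMED

print(initial_loe("Yes", artefact_uploaded=True).name)            # EXHIBITED
print(initial_loe("I don't know", artefact_uploaded=False).name)  # NO_CLAIM
```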
4. Production of a Report
The Study Coordinator shall publish a national BIM adoption report aligned with the BIMei reporting template. The report will include both quantitative statistical analyses and qualitative assessments of BIM adoption and maturity across the Defined Region. An executive summary with recommendations to improve regional BIM adoption will be provided. To maximise accessibility, the report will be published digitally in English and translated into relevant regional languages. Both domestic and international publicity of the report is critical for benchmarking progress and informing future BIM adoption efforts in the region and globally.