
Frequently Asked Questions

What is BULATS?

BULATS stands for the Business Language Testing Service. There are tests in 4 languages: English, French, German and Spanish. The Standard Test and the Computer Test include questions on listening, reading and language knowledge. The Writing and Speaking Tests are separate and stand alone.

[top]

What accents are used in the listening test?
A variety of native English accents including British English and American English. Proficient non-native speakers are also occasionally used.

[top]

What languages are BULATS available in and how is the Overall BULATS score calculated?

The BULATS tests are available in English, French, Spanish and German. The overall BULATS score is not simply an average of the two section scores. The program uses encrypted look-up tables to calculate the overall score as there is different ability weighting attached to the two sections. If a result in the Listening section is 50 for example, and the score for the Reading and Language Knowledge section is 60, it will not necessarily follow that the overall result will be 55.
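
The effect of unequal weighting can be illustrated with a toy calculation. This is purely a hypothetical sketch: the real look-up tables are encrypted and their weightings are not published, so the 0.45/0.55 split below is an invented assumption.

```python
# Hypothetical sketch only: the real BULATS look-up tables are encrypted
# and their weightings are not published. The 0.45/0.55 split is invented
# to show why the overall score need not equal the simple average.

def overall_score(listening: int, reading_lk: int,
                  w_listening: float = 0.45, w_reading: float = 0.55) -> int:
    """Combine two section scores (0-100) using unequal weights."""
    return round(w_listening * listening + w_reading * reading_lk)

# The example from the text: Listening = 50, Reading/Language Knowledge = 60.
print(overall_score(50, 60))  # under these weights, not the simple average of 55
```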

[top]

How useful is exam preparation for improving students’ language ability?

BULATS tests reading and listening skills that are required for most purposes – not only in business. So exam preparation is valuable even for someone not taking the exam. However, explanations and examples are included here for candidates who have not had specific exam preparation.

[top]

How do BULATS tests link to other language tests?

The relationship between BULATS and other language tests

BULATS links to the Cambridge ESOL framework of language levels which is recognised around the world.

Cambridge ESOL is a member of ALTE - the Association of Language Testers in Europe - which has eighteen institutional members testing their own languages as a foreign language. Currently fifteen languages are represented.

As part of the Cambridge ESOL system, BULATS links to the ALTE framework of five proficiency levels and the Common European Framework Levels. These levels are based on the work of the Council of Europe and provide an international and multi-lingual basis for comparison of language proficiency in different languages. An attached document illustrates how other language examinations link to the same ALTE Framework.

How these links are established

  • the ALTE Framework, which establishes what represents each level in each language
  • a common set of statements of ability, which are validated against tests for each language
  • standardised test specifications across the different languages

For the Standard BULATS Tests in all languages, each test is linked to the ALTE Framework through the use of 'anchor tests' and statistical analysis of the results of trialling. These anchor tests are used to measure the difficulty of each question in relation to a fixed scale which has been established through extensive research on more than 1000 candidates, of a wide range of nationalities, at each level of proficiency.

For the Speaking and Writing Tests, levels are established by matching them to standards of performance indicated in an extensive databank of performance built up over many years of running tests of speaking and writing.

 

[top]

Why shouldn't companies just use the tests used by local language schools?

  • Producing reliable and relevant language tests is a specialised skill. Using tests produced by language schools/consultants, whose expertise is in training rather than in producing tests, means a lower level of quality - weaknesses such as questions with more than one correct answer or with no correct answer, questions which depend on world/cultural knowledge rather than language skills, questions which focus on trivial skills rather than key skills, questions where candidates can guess the answers without understanding the text, typing and linguistic errors, etc.
  • Language schools/consultants are not in a position to do the research necessary to establish a fixed scale of language ability to underlie the test results - this means there is no real basis for saying with any confidence what level of ability is indicated by a particular score.
  • BULATS is based on a fixed scale of language ability and this can be related to leading language examinations in Europe.
  • Language schools/consultants are not able to trial their tests as extensively as is done for BULATS. In most cases, language schools/consultants will not trial their test at all.
  • Language schools/consultants are not able to measure the exact difficulty of each question in the test, and are therefore unable to relate the difficulty of their examinations to the ALTE/Council of Europe Framework.
  • BULATS can provide equivalent tests in English, French, German, and Spanish.
  • BULATS is completely independent of the training provided or offered. Companies can have complete confidence that the results provided by BULATS are not influenced by any other interests - such as the need to demonstrate progress in existing courses or the need for further training courses, etc.

[top]

Since BULATS is divided into Standard Test, Computer test, Speaking test and Writing test, do employees have to take the Standard Test, the Speaking Test and the Writing Test on the same day?

BULATS offers an organisation the flexibility to assess their staff, trainees or applicants in whichever way they like. They can use just one test (e.g. the Standard Test) or they can use all three tests. They can make their staff take all three tests in one day or on three different days. The client organisation is able to choose whatever strategy they think most useful - though of course the Agent will advise the client on what options are most likely to meet the client's requirements.

[top]

In the listening test, is it necessary for candidates to understand every word?

No, they just need to pick out the information which is being tested.

[top]

Several candidates scored zero in the Listening section of the test. Their scores in the Reading and Language Knowledge test were much higher. How is this possible?

It is rare for candidates to score zero in the Listening section, because even by guessing a candidate will usually answer a few questions correctly. The zeros these candidates received were due to their either skipping questions or not completing the test within the allotted 75 minutes (if a time limit was set). As CB BULATS is an adaptive test, if candidates skip questions or do not answer enough questions, the test simply cannot calculate their ability in that skill area.

To avoid this happening you should:

  1. ensure that all candidates know that they have to answer all the questions put to them during the test.
  2. ensure that they finish the test prior to the time limit set (if this has been set in the Supervisor mode).

[top]

Are the question types the same in the Computer Test and the Standard (pencil and paper) Test?

The BULATS Computer Test shares many of the same question types as the Standard paper test. The Standard test also includes a form-filling item type, an error correction activity and a multiple matching listening task.

[top]

Should I set a time limit for the Computer Test?

The length of each test varies because of its adaptive nature, but a test should not take less than 40 minutes and should not continue for more than 75 minutes.

[top]

What if half my staff take the Standard Test and the other half take the BULATS Computer Test?

The two types of test are fully compatible. The results from the Computer Test mean the same thing as the results from the Standard Test. However, organisations should also bear in mind that not all candidates get exactly the same result on both tests. There can be a number of reasons for this. Firstly, different people react differently to computers generally - some are technophiles and some are technophobes - and this can affect their performance. Another reason is that the adaptive nature of the Computer Test reduces the likelihood of lower-level candidates being overwhelmed by the difficulty of the test.

[top]

Why is it important for scores to be reported on a European standard?

More and more companies are working in partnership with companies from other countries - whether this is the result of mergers, takeovers, joint ventures, closer supplier/distributor chain relationships, or other types of relationship. Communication is essential for effective and efficient working practices, and it is vital that the same standards of language skill are used across these international partnerships. What companies need these days is a single system of describing different levels of language ability. And this single system should not just apply for one language, but for a whole range of key languages.

Most companies do not have the time or expertise to establish a framework of standards in language ability within their own company, and they would certainly find it very difficult to link these to standards used by other companies they work with. What is needed is a properly researched, relevant and internationally accepted framework of levels which any company in the world can relate to, whatever their particular needs.

ALTE (the Association of Language Testers in Europe) has been working for many years on establishing such a framework for all European languages. There is no system in the world with the same depth and breadth to its framework. For example, with the ALTE Framework, a multinational company in France can apply the same standards of language skill for workers coming to the French headquarters and who need French, for their staff who work in international business and need English, and for their staff who are frequently communicating with their Argentinian suppliers and so need Spanish. The ALTE framework has now been linked to Council of Europe levels which have a wide currency both in Europe and further afield.

BULATS is the only system in the world that can provide this single system in such a reliable and practical way.

 

[top]

What sorts of materials are used in the test?
All materials are adapted from authentic business materials, such as articles from business magazines, company literature, business correspondence, presentations, discussions, etc. The recordings for the Listening Test are scripted and nearly all have one or two speakers.

[top]

How reliable are the test results?

An estimate of the test's reliability gives an indication of how far a difference in score on a test is significant and not just the result of chance. The most common estimate of reliability for this type of test is the alpha coefficient, a figure between 0 and 1, where a value close to 1 indicates high reliability. BULATS tests have alpha coefficients over 0.9, which conforms to accepted international standards of reliability for tests of this kind.
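
The alpha coefficient mentioned above can be illustrated on made-up item-level data. This is the standard Cronbach's alpha formula, not BULATS's internal analysis software:

```python
# Standard Cronbach's alpha on made-up dichotomous item data (1 = correct).
# Illustrative only; not the actual BULATS analysis software.

def cronbach_alpha(scores):
    """scores: one list of per-item scores for each candidate."""
    k = len(scores[0])  # number of items

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [  # four candidates, four items
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(responses), 2))  # closer to 1 = more reliable
```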

Margin of error

For BULATS, one can be confident that a Band 3 is clearly distinct from a Band 5. With contiguous levels, e.g. 3 and 4, there will be an element of uncertainty around the cut score. The standard error of measurement (SEM) for a BULATS test is about 4 points on the scale for the overall test result (0-100). We would therefore recommend that, when clients are making important decisions on the basis of these scores alone, they should allow a margin of 5 points on either side of any benchmark levels for the final overall score.
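
The recommendation above amounts to a simple decision rule. A minimal sketch, assuming a client-chosen benchmark score and the recommended 5-point margin:

```python
# Sketch of the margin-of-error advice above: scores within 5 points of a
# benchmark are treated as borderline rather than clear passes or failures.
# The benchmark values themselves are up to the client organisation.

MARGIN = 5  # recommended margin for the overall score (0-100 scale)

def classify(score: int, benchmark: int, margin: int = MARGIN) -> str:
    if score >= benchmark + margin:
        return "clearly above benchmark"
    if score <= benchmark - margin:
        return "clearly below benchmark"
    return "borderline: gather more evidence (e.g. Speaking/Writing tests)"

print(classify(68, 60))  # clearly above benchmark
print(classify(57, 60))  # borderline
```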

For greater certainty, more evidence of ability is required, and clients are encouraged to use the productive skill modules. These are particularly suitable for discriminating at the top end of the scale (levels 3-5). These modules provide an indication of strengths and weaknesses in different skills in addition to a band score.

Section scores

Section scores are less precise than the overall score. For each section score we again recommend a margin of 5 points on either side of any benchmark levels; because each section contains fewer questions, a 5-point margin on a section score represents a wider band of uncertainty than the same margin on the overall score.

 

[top]

What is a 'good result'?

BULATS gives information on what level of ability each candidate has in the language tested. It does not in itself say what is a 'good' level. Candidates are placed in a framework of five levels, and descriptions of what each level means in practical terms are provided to clients.

[top]

How are the 'Can-Do' statements linked to test results? How can it be proved that someone at BULATS level 3 can typically carry out the same tasks as someone with the First Certificate in English (FCE), etc.?

The 'Can-Do' statements were originally produced by analysing the content of the Main Suite examinations and deducing which real-life skills would be compatible with the skills tested at each level.

As the result of extensive research, Cambridge ESOL has produced a Common Scale of Language Ability. This is a fixed scale - a yardstick - on which all measures of language ability can be placed.

Underlying this is the assumption that, despite variation in candidates' skills in particular areas, we can sum up their overall ability as a single score. Two candidates may have the same overall ability although one is stronger in speaking skills and the other is better at understanding reading texts.

It is then possible to look at the correlation between the scores candidates get on particular types of test and their overall score. From this analysis we find that certain types of test correlate highly with overall ability. Where possible, it is better to test all relevant skills; but where that is not feasible, we can make use of those types of test which correlate most highly with overall ability.

 

[top]

How many hours of teaching are necessary to go up one ALTE level?

It is impossible to give a clear answer to this because it depends on many variables:

  • the intensity of the training (100 hours continuous training may be less effective than 100 hours part-time)
  • the opportunity for practice outside the classroom training (especially whether the trainees are in a country or company where the target language is spoken, but also issues such as time spent in self-access centres, quantity of homework, etc.)
  • the quality of the training
  • the motivation and aptitude of the learner
  • the starting level - beginners progress more quickly than advanced learners

But as a rough guide, the Council of Europe suggests that the average learner should move up a level with 180-200 learning hours, including independent work. A full-time employee on a training course with three hours' tuition and one or two hours' 'homework' each week would probably take about one year to move up an ALTE level.
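
The one-year estimate can be checked with rough arithmetic. The weekly figures below are taken from the example above, with homework averaged at 1.5 hours:

```python
# Rough arithmetic behind the "about one year" estimate above.
# 190 is the midpoint of the 180-200 learning hours suggested per level.

hours_per_level = 190
weekly_hours = 3 + 1.5          # 3 hours' tuition plus 1-2 hours' homework
weeks_needed = hours_per_level / weekly_hours
print(round(weeks_needed))      # roughly a working year of weekly study
```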

[top]

What level of English is required to understand the test?
The Standard Test contains a range of items at all levels. In general, the more difficult Reading and Language Knowledge questions are in Part 2 of the test. The Computer Test is adaptive and selects questions based on the answers a candidate has already given. There is no 'pass mark': candidates are placed in one of six levels based on their score.

[top]

What level does an executive need to have in order to work with a native speaker, particularly for speaking?

As a general guideline, and assuming that the executive needs to operate independently in a typical range of managerial tasks, we would advise that the learner should be level 3 or better, preferably with a minimum of level 4. This would almost certainly apply to speaking skills as well, though it may not apply to writing skills in all cases, e.g. where the manager only needs to write short, note-like emails.

However, a more accurate answer would depend on the tasks involved and the degree of independence and responsibility the executive has in those tasks. Some examples of the type of task which we distinguish from the point of view of language level are:

  • requesting work-related services
  • providing work-related services
  • participating in meetings and seminars
  • following a demonstration
  • giving a presentation
  • understanding correspondence
  • writing faxes, letters, etc.
  • understanding reports, journals, etc.
  • understanding notices and instructions
  • taking phone messages
  • making outgoing phone calls
  • making travel and hotel arrangements

With more precise information about the tasks and responsibilities of the executive, it is possible to give a more precise and useful answer.

[top]

Do candidates have to wait for a certain time before re-sitting the test? If so, why?

There is no official minimum time lapse requirement before a candidate resits a BULATS test. However, we advise clients not to put candidates through the test again until they have been through at least 100 hours of training over a period of more than 3 months. This is because BULATS is not designed to detect small improvements in ability. If a candidate resits the test after a short period, he or she may get the same score or may even get a lower score.

Occasionally, candidates ask to re-sit the test because they feel they underachieved significantly the first time round - because of illness, misunderstanding what was required of them, or some other reason. This is a different situation, since the purpose of the second test is not to pick up improvements over a short period. We would therefore not particularly advise against this - though the client may, of course, not want to allow candidates this option.

[top]

Is BULATS internationally recognised?

Yes. BULATS is used increasingly by organisations throughout the world, including private companies, government departments, NGOs, and embassies. Some multinational organisations use it as a global benchmark throughout their offices. For the most up-to-date information, see www.bulats.com.my.

 [top]

How do I decide whether to take the speaking, writing or standard BULATS test or some combination of them?

A key feature of BULATS compared to other tests of English is that it offers you a choice. Ideally, combine the BULATS Standard Test with the Speaking and/or Writing Test to build a more complete picture of a person’s English competence. In making your decision, it is important to consider what you want to use the test for. For example, some organisations use the Standard Test at an early stage in recruitment to check the minimum English competence of a large number of applicants. They may then use the Speaking and/or Writing Tests in the final selection from a few candidates. If a post requires a high level of writing competence for emails or reports, then it makes sense to use the Writing Test. Similarly, if a person needs to speak English to deal with customer enquiries or to work with English-speaking staff, then use the Speaking Test.

[top] 

Why is BULATS priced at the top end of the market?

BULATS has been carefully designed to offer a premium-quality service, with the reliability of a double-examiner marking system, flexibility in its test dates and venues, and a fast turnaround of results. The costs involved in maintaining this service mean that BULATS is inevitably priced at the top end of the market. A key question is the value an organisation places on raising the standard of English of its staff. BULATS offers a quality option.

[top] 

How do BULATS scores compare to those of other international English tests?

BULATS focuses specifically on modern workplace English and on the effectiveness of a person’s ability to communicate in a real situation. Many other tests do not. Trying to compare different English test scores is therefore like trying to compare apples with oranges. However, underpinning the design of BULATS is the carefully researched set of linguistic criteria of the Association of Language Testers in Europe (ALTE). Cambridge ESOL, the designers of the BULATS test, therefore do offer an approximate correlation between BULATS band scores and the scores of their other tests, including IELTS.

[top]

Is it true that Speaking and Writing tests are double marked? How is the final score calculated from the two examiners' marks?

Yes, all Speaking and Writing tests are marked independently by two trained examiners. For Speaking tests, a second examiner listens to the recording of the test interview and gives a score. The second examiner's score carries a higher weighting than the first's, as they are in a position to concentrate fully on the candidate's use of English, replaying the recording if necessary. If there is a discrepancy between the two examiners' scores, a third independent senior examiner is asked to adjudicate and give the final score.
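
The marking procedure described above can be sketched as a simple rule. This is a hypothetical illustration only: the actual examiner weightings and the discrepancy threshold that triggers adjudication are not published, so the values below are invented.

```python
# Hypothetical sketch of the double-marking rule described above.
# The 0.4/0.6 weighting and the 1-band discrepancy threshold are invented;
# the actual figures used by Cambridge ESOL are not published.

def final_band(first: float, second: float, adjudicated=None,
               threshold: float = 1.0) -> float:
    """Combine two examiners' band scores, escalating large discrepancies."""
    if abs(first - second) > threshold:
        if adjudicated is None:
            raise ValueError("discrepancy too large: senior examiner needed")
        return adjudicated  # the senior examiner's decision is final
    # the second examiner's score carries the higher weighting
    return 0.4 * first + 0.6 * second

print(final_band(3.0, 3.5))                   # close scores: weighted combination
print(final_band(2.0, 4.0, adjudicated=3.0))  # large gap: adjudicated score
```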

[top] 

Who are the examiners for the speaking and writing tests?

BULATS Speaking and Writing examiners are carefully selected and trained. All applicants must have native-speaker English competence, a recognised international qualification as a teacher of English as a foreign/second language, and at least three years' relevant practical teaching experience. Applicants must then successfully complete an intensive training programme, which includes submission of standardised scores to Cambridge ESOL in the UK for verification.

[top]