Function Repository Resource:

RandomTextCompletionQuestion

Source Notebook

Generate a text completion (cloze) language test item from a random Wikipedia article

Contributed by: Mark Greenberg

ResourceFunction["RandomTextCompletionQuestion"][]

returns a question object with six blanks, based on a passage from a random Wikipedia article.

ResourceFunction["RandomTextCompletionQuestion"][b]

returns a question object with b blanks.

Details and Options

A text completion question, or cloze question, occludes random individual words in a text passage, requiring the user to fill in the blanks. This question format can be used in a variety of assessment tasks. One common use of the cloze format is to assess the comprehension of a student acquiring a second language. A cloze test can be administered with multiple choices for each blank, which lets the user rely more on the forms of the words and their parts of speech than on their meanings. Without choices, a cloze test makes the user rely more on the semantic meaning of the passage. Typically, if a cloze assessment requires a free-form answer, it needs to be hand scored to allow for synonyms and spelling variations.
ResourceFunction["RandomTextCompletionQuestion"] returns a list of two items, a QuestionObject and a list of the answers.
ResourceFunction["RandomTextCompletionQuestion"] can take the following options:
"DistractorType"    "Inflections"    how to choose the distractors
"QuestionType"    "SelectCompletion"    the form of the output
Possible values for the "DistractorType" option include the following:
"Inflections"    (default) distractors try to match the parts of speech and the inflectional endings of the answers
"PartsOfSpeech"    distractors match the parts of speech of the answers
"Random"    distractors are random words
Possible values for the "QuestionType" option include the following:
"SelectCompletion"    (default) blanks in the passage are drop-down menus, each with nine choices
{"SelectCompletion", n}    blanks are drop-down menus with n choices (1 ≤ n ≤ 9)
"TextCompletion"    blanks in the passage are text boxes
"Parts"    output is a list {q, a, {d1, d2, …}}, where the string q is the passage with blanks, a is the list of answers and the di are lists of distractors
When ResourceFunction["RandomTextCompletionQuestion"] produces a QuestionObject, the user's results are reported, including the words they chose, the correctness of each answer and the overall score for the question as a fraction.
ResourceFunction["RandomTextCompletionQuestion"] gets text from English-language Wikipedia articles, so some service credits are consumed.
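Because the function returns a pair, the question object and the answer list can be captured in a single step; a minimal sketch (variable names are illustrative):

```wl
(* capture the QuestionObject and the answer list separately *)
{question, answers} = ResourceFunction["RandomTextCompletionQuestion"][3];
question   (* the interactive question with three blanks *)
answers    (* the three occluded words, in order *)
```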

Examples

Basic Examples (1) 

Get a random text completion question with answers:

In[1]:=
ResourceFunction["RandomTextCompletionQuestion"][] // Column
Out[1]=

Options (7) 

QuestionType (4) 

The "QuestionType" option by default is set to "SelectCompletion", so the blanks in the question are drop-down menus:

In[2]:=
ResourceFunction["RandomTextCompletionQuestion"][2] // Column
Out[2]=

With "QuestionType" → {"SelectCompletion", n}, you can control how many choices are given in each drop-down menu:

In[3]:=
ResourceFunction["RandomTextCompletionQuestion"][2, "QuestionType" -> {"SelectCompletion", 4}] // Column
Out[3]=

When "QuestionType" is set to "TextCompletion" the blanks in the question are text boxes:

In[4]:=
ResourceFunction["RandomTextCompletionQuestion"][2, "QuestionType" -> "TextCompletion"] // Column
Out[4]=

When "QuestionType" is set to "Parts", the result is a list of the passage with blanks, the answers, and distractors for each blank:

In[5]:=
ResourceFunction["RandomTextCompletionQuestion"][2, "QuestionType" -> "Parts"]
Out[5]=

DistractorType (3) 

When the "DistractorType" option is set to the default, "Inflections", the result has distractors that mimic the parts of speech, and often also the endings, of the correct answers. For instance, if the correct answer is the verb catching, then distractors for that blank will be verbs ending in -ing:

In[6]:=
ResourceFunction["RandomTextCompletionQuestion"][2, "DistractorType" -> "Inflections"] // Column
Out[6]=

When "DistractorType" is set to "PartsOfSpeech", the result has distractors that mimic the parts of speech, but not the inflection endings, of the correct answers:

In[7]:=
ResourceFunction["RandomTextCompletionQuestion"][2, "DistractorType" -> "PartsOfSpeech"] // Column
Out[7]=

The third allowable value for "DistractorType", "Random", fills the distractor lists with random words, without regard to their parts of speech or inflectional endings:

In[8]:=
ResourceFunction["RandomTextCompletionQuestion"][2, "DistractorType" -> "Random"] // Column
Out[8]=

Applications (4) 

Make a hand-scored text completion (cloze) test question suitable for printing:

In[9]:=
cloze = ResourceFunction["RandomTextCompletionQuestion"][2, "QuestionType" -> "Parts"];
prompt = "What two words might fill in the blanks?";
passage = cloze[[1]];
answers = cloze[[2]];
StringForm["`1`\n\n`2`", prompt, Style[passage, 18, FontFamily -> "Times New Roman"]] // Framed
Out[12]=

Make a computer-scored cloze quiz question:

In[13]:=
cloze = ResourceFunction["RandomTextCompletionQuestion"][2];
answers = cloze[[2]];
question = cloze[[1]]
Out[15]=

Make a free-response cloze question with answers the teacher can compare to the student's responses:

In[16]:=
cloze = ResourceFunction["RandomTextCompletionQuestion"][2, "QuestionType" -> "TextCompletion"];
answers = cloze[[2]];
question = cloze[[1]]
Out[18]=

The student's response can be examined by clicking the clipboard icon that appears:

In[19]:=
cloze = ResourceFunction["RandomTextCompletionQuestion"][2];
answers = cloze[[2]];
question = cloze[[1]]
Out[21]=
In[22]:=
AssessmentResultObject[<|"Score" -> Rational[1, 2], "AnswerCorrect" -> False, "GivenAnswer" -> {"tiger", "family"}, "Explanation" -> {}, "ElementInformation" -> <|"Scores" -> {0, 1}, "AnswerCorrect" -> {False, True}|>, "Timestamp" -> DateObject[{2025, 3, 17, 13, 55, 35.636829}, "Instant", "Gregorian", -7.], "AssessmentSettings" -> {"ListAssessment" -> "SeparatelyScoreElements"}, "AnswerComparisonMethod" -> "String", "SubmissionCount" -> 1|>][{"Score", "GivenAnswer"}]
Out[22]=

Publisher

Mark Greenberg

Requirements

Wolfram Language 13.0 (December 2021) or above

Version History

  • 1.0.0 – 26 March 2025

Source Metadata

Related Resources

Author Notes

RandomTextCompletionQuestion pulls random, unfiltered articles from Wikipedia. Also, the distractors come from the Wolfram WordList, which may contain words considered inappropriate in some settings. In uses where avoiding obscenities and other potentially unacceptable content is important, it is advisable to review the output before displaying it to the end user.
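One lightweight review step, sketched here with a hypothetical blocklist (the words shown are placeholders, not a recommended list), is to generate the item in "Parts" form and scan the passage, answers and distractors before presenting the question:

```wl
(* hypothetical blocklist; substitute terms appropriate to your setting *)
blocklist = {"placeholderword1", "placeholderword2"};
{passage, answers, distractors} =
  ResourceFunction["RandomTextCompletionQuestion"][2, "QuestionType" -> "Parts"];
(* gather every word that could appear in the rendered question *)
candidates = ToLowerCase /@ Join[TextWords[passage], answers, Flatten[distractors]];
If[Intersection[candidates, blocklist] === {},
  "No blocklisted words found",
  "Review before displaying"]
```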

License Information