New Turnitin AI writing score column in the Assignment Inbox

From 5 August 2025, Turnitin has introduced a new AI writing score column within the Assignment Inbox of the classic Standard Assignment in Feedback Studio. This feature is retroactive and applies to previously submitted work.

The AI writing score column is only visible to instructors on the course. Students do not have access to it.

AI Writing disclaimer

Before the AI writing scores become visible, users must first scroll through and accept a disclaimer. This will appear the next time you access a Turnitin assignment and only needs to be accepted once. If the disclaimer is declined, the AI score will remain blurred in the submission list.

If you have any questions about this feature, please contact digital-education@bristol.ac.uk.

Ultra Update – August 2025

One of the advantages of our move to Blackboard Ultra is the ongoing potential for new feature releases and updates. New releases take place on the first Thursday evening of every month, which is much more frequently than for Blackboard Original. We will communicate about new features and changes via this blog. If you want to be kept up to date with the latest information, we recommend subscribing to the blog. 

This release has new or updated features in the following areas:   

  • Instructional design; 
  • Communication and collaboration; 
  • Gradebook; 
  • Blackboard Core. 

These improvements impact instructors, students, and administrators. Below you’ll find our highlights, and more detail on the most relevant updates.

Highlights

Table of contents and full-width assessment panel in Learning Modules

Several user interface and navigation improvements have been made to Learning Modules. Learning Modules now include a collapsible table of contents and assessment items appear in a full-sized panel.

An image of a Learning Module table of contents. It's a panel showing a list of items in the module with a green tick next to those that have been completed and a green progress bar at the top. There's also an arrow in the top right of the panel which has been highlighted.
Image 1. a Learning Module table of contents. The panel can be collapsed by using the arrow button at the top.

With the Force sequence option on, students must use the Next and Previous buttons to move through content in order. Students can’t use the table of contents to jump ahead unless they have already completed the item they are navigating to.

An image of a learning module. The collapsible table of contents with a green progress bar and completion indicators is shown on left hand side. The Previous and Next navigation arrows in the top right of the interface are highlighted.
Image 2. The Previous and Next navigation buttons now appear closer to each other in the user interface within Learning Modules.
An image of a learning module with the assessment item full-width panel displayed. The panel shows the assessment due date, the maximum points and attempts remaining.
Image 3. Assessment items within Learning Modules now appear in a full-sized panel.

Please see the Release Notes for further information.

Please see the Release Notes for further information on these and the other updates in this release.

Further help and support

For more guidance on using Blackboard Ultra, please explore Ultra Essential Guidance. 

Students can view an Ultra orientation and advice on using Ultra from the Help for Students section (or equivalent) in their Blackboard courses.

Scheduled Turnitin Maintenance, Saturday 9 August, 2025

Turnitin have informed us that their services will be unavailable due to scheduled maintenance on Saturday 9 August between 16:00 and 22:00 BST. During this maintenance window, users will be unable to submit or mark papers, or to view marks, feedback or similarity reports.

Further information is available on the Turnitin status page.

Please accept our apologies for any inconvenience this may cause.

Questionmark OnDemand Scheduled Maintenance 16 August 09:00 – 21:00

Questionmark will be carrying out scheduled maintenance on Saturday 16 August between 09:00 and 21:00.

The purpose of the maintenance is to deploy system updates to ensure the ongoing reliability, security and uptime of the platform.

During this time users may experience service disruption. We apologise for any inconvenience caused.

If you have any questions, please email digital-education@bristol.ac.uk.

Questionmark upgrade, 4-7 August 2025

Questionmark will be upgrading our system on 4 August 2025. During the upgrade, the system will remain available but ‘at risk’. We don’t recommend using Questionmark between 4 and 7 August – this period will be used to carry out critical testing and configure the settings for the new system.

This upgrade introduces a new user interface for both staff and students, as well as enhanced functionality, including the Questionmark secure browser and Next Gen scheduling.

We apologise for any inconvenience this may cause. If you have any questions, please contact digital-education@bristol.ac.uk.

Accessibility and AI: Reflections from June 2025 Events

I attended two accessibility-related events in the week beginning 23 June. On Wednesday 25th I was virtually at Nottingham University’s Digital Accessibility 2025 Conference, and on Friday 27th I went to the JISC ‘AI and accessibility skills: building the accessibility professional of the future’ event in London.

JISC Event Highlights

Powerpoint title slide reading: ‘A fundamental shift? Asking critical questions on AI and skills in the accessibility workflow. Dr Sarah Lewthwaite, Senior Research Fellow, UKRI Future Leaders Fellow. JISC, London, June 2025.’

The JISC AI and Accessibility event was interesting; I went ready to have my scepticism about AI challenged. I’ve been at too many AI-centred events where someone tells me with great confidence that ‘In the next year, AI will revolutionise [insert whatever here]’, and all that seems to happen is that there are fewer fingers in the AI-generated photos and search engine AI results get things incorrect more confidently.

Ethical, environmental and quality issues with GenAI/LLMs aside, I’m not anti-AI but the way it’s being shoehorned into everything (whether it’s any good or not) is a terrible way to deploy a technology. While the AI you get on your new phone, TV, toaster or Office Suite is generally a substandard experience, I suspect when the current bubble bursts AI will mature into a solid tool in some fields, with coding and accessibility being strong candidates.

So it was marvellous to attend something without the snake oil; what was covered was measured and punctuated with discursive elements. It probably helped being sat on a table with some other Learning Technologists, someone from The Government Digital Service and a rep from W3C – we had a good group with a variety of accessibility experience and knowledge.

The event was led by Dr Sarah Lewthwaite with colleagues from the University of Southampton. We were introduced to their research into ‘Teaching Accessibility in the Digital Skill Set’, which investigates the teaching of digital accessibility in both academic and workplace settings. Themes such as recognition of disability as a site of expertise, engagement with models of disability, and challenging techno-ableism and bias were introduced.

I was particularly taken with the concept of ‘Questioning what is already known’; when The Digital Education Office ran “lived experience” events in collaboration with AbilityNet in 2019 I had some of my existing preconceptions challenged. Some stuff is just cemented into ‘what is known’ and understood, but disabled folk will often powerfully challenge and take down this received wisdom. It shouldn’t be revolutionary that we take measures to include them in discussion and development from the very start.

We were asked ‘Is this a moment of transformation?’ – a great question with a muddy answer from my point of view. With the Equality Act 2010, the Public Sector Bodies (Websites and Mobile Applications) Accessibility Regulations 2018 and now the European Accessibility Act 2025 all creating requirements for universities to ensure that our digital estate is accessible to everyone, it feels like the potential for AI to fix issues and make things actually accessible is massive. AI could mitigate the very real financial, reputational and workload issues we currently face.

We heard from Dimple Khagram, CEO of Purple Beard, about her thoughts on humanising AI so that it works for humans. Dimple had some interesting takes on what it takes to “Be the human AI can’t replace”; it was very much about collaboration rather than competition with AI. I still can’t get past the fact that, for the majority of use cases, I’ve yet to save time with AI. Once you factor in the significant editing and checking of AI-generated content, I can create images, edit audio and write words faster and more authentically than when I’ve attempted to do so with AI. But I will admit I am possibly not the target audience in this use case.

Accessibility consultant Tim Scannell spoke to us via two BSL interpreters about AI attempts to create BSL translation software and how a good idea suffered from poor implementation: there hadn’t been any engagement with BSL users, so the tech focused on fingerspelling and missed the additional context given by body language and facial expressions. He questioned where the data used to create the robotic avatars had come from; it was missing some information, and things like BSL slang and regional dialects were absent. The phrase ‘Nothing about us without us’ was used several times, and I feel that nailed a central theme of the day’s topics.

The final presentation, ‘Accessible by default?’, was from Dr Benjamin M. Gorman from Bournemouth University’s Department of Computing and Informatics; it focused on the skills we can’t afford to lose. It highlighted that GenAI needs to be explicitly told to make things accessible, and that humans need a deeper understanding of accessibility to be able to create the right prompts and pick up on things that cannot be codified; the visual accessibility of something, for instance, needs human-level checking and editing. The phrase ‘No ethics. Just output.’ was used, which speaks volumes about my issues with having to heavily edit AI output to the point that I find it faster and less frustrating to just do it from scratch.

The sessions were punctuated by discursive breakout sessions where we were prompted to answer questions such as ‘Can Accessibility function at the speed of AI?’ and ‘Can accessibility be automated?’

Nottingham Digital Accessibility 2025 – AI Highlights

Powerpoint slide showing an AI image of a chick made out of a citrus fruit, breaking out of the fruit peel. Text reads: ‘What we talk about when we talk about AI. Current debates about AI are typically about Generative AI (genAI). More “traditional” AI is used in many pattern recognition tools, which are assistance tools, like spell check and voice recognition, but the key difference is between facilitation and generation of content.’

AI and accessibility featured heavily in Nottingham University’s Digital Accessibility 2025 Conference too. I saw some interesting work from Steve Wang at Nottingham around an AI-driven tool which could take images of equations and convert them into accessible code for use in their Digital Learning Environment. Accessible equations are a Holy Grail of digital accessibility, so it’s great to see AI helping to make improvements.

The standout session for me at Nottingham’s conference was from Alice Bennet from the University of York’s library team, who presented findings from a paper titled ‘AI and Accessibility: assistance, responsibilities and risks’. Alice framed the potential for AI and accessibility, but also the risks and dangers of reliance on AI. A strong point was made of the need to unpick Generative AI/LLMs from the AI which has been used in assistive tech for decades. She argued that an uncritical reliance on the technology over more traditional accommodations and adjustments could lead to negative outcomes. The loss of agency for disabled students as they are pushed towards AI solutions that might not be quite right really chimed with what Tim Scannell said about the BSL interpretation software at the JISC event.

I’d recommend Alice’s AI and Accessibility: feature, the future or fad post on York’s Digital Accessibility blog as a piece of further reading on the subject.

‘Nothing about us without us’

I think AI has massive potential to lower many barriers, but we still need to keep a skilled human eye on this ‘solutionising’ and actually include people with lived experience of disability; there can’t ever be a one-size-fits-all fix, it’s all too personal, too individual for that. AI could help meet this need, but steps to address it need to be taken now.

I’ve worked with students who are disabled or require adjustments for almost 25 years in both Further and Higher Education, and I’ve seen Ed Tech and Assistive Tech used to overcome the barriers they face, to varying degrees of success. ‘Nothing about us without us’ speaks volumes where technology is used to provide solutions. Disabled people need to be front and centre of this. The rush to use AI in everything should not be any different.

Deletion of Re/Play content

As part of our summer programme of work, we will be deleting the oldest lecture capture content from the Re/Play service. This content was recorded in the academic years 2015-16, 2016-17 and 2017-18. These recordings were made for students who have now completed their programmes of study and the majority of them have not been viewed in the last three years.

These recordings will be removed from the system on 1 September 2025. If you require any recordings from this period, please email digital-education@bristol.ac.uk with the unit code, the title of the recording and the date it took place, and we will transfer ownership.

This will not affect any recordings made by an individual or manually uploaded into the system. If you have any questions, please contact the Digital Education Office by emailing our team.

Ultra Update – July 2025

One of the advantages of our move to Blackboard Ultra is the ongoing potential for new feature releases and updates. New releases take place on the first Thursday of every month, which is much more frequently than for Blackboard Original. We will communicate about new features and changes via this blog. If you want to be kept up to date with the latest information, we recommend subscribing to the blog. 

This release has new or updated features in the following areas:   

  • Instructional design; 
  • Tests and assignments; 
  • Gradebook; 
  • Learner progression and personalised experience; 
  • Blackboard Core. 

These improvements impact instructors, students, and administrators. Below you’ll find our highlights, and more detail on the most relevant updates.

Highlights

Add captions to image blocks in Documents

Instructors can now add captions to image blocks in Documents. Added above or below an image, captions can provide context and aid understanding.

Screenshot showing the Edit File Options menu. The menu has Display Name and Image caption fields, and a caption position option. The above image position is selected.
Image 1. Instructors can use the Edit File Options menu to choose the position of the caption.
An image of a tabby cat with a caption above that reads 'Tabby is one of the most common fur patterns in cats'.
Image 2. An example of an image with a caption above.

Expanded activity page for instructors

The Activity page now has a courses section which shows new activity in current, open courses since an instructor last logged into Blackboard. The courses section includes shortcuts. These are numbered indicators for each course which instructors can select to view:

  • Items that need marking 
  • Messages
  • Students with alerts
A screenshot showing the activity page. The activity stream is on the right hand side, and the courses section is in the middle of the page. The top of the courses section reads 'Current courses' and there are four courses, with an image for each, displayed below it. On several of the courses, numbered purple indicator icons are shown.
Image 1. The new Activity page has a courses section as well as the activity stream.

Please see the Release Notes for further information.

Render mathematical formulas with MathJax

The formula rendering experience in the Content Editor has been enhanced through the implementation of MathJax, a powerful tool for displaying mathematical notation. This update improves the visual accuracy and consistency of LaTeX-based formulas, aligning them more closely with scientific and academic standards. 

MathJax offers a more precise rendering style preferred by many STEM instructors. When activated, MathJax will automatically render LaTeX code entered directly in the Content Editor across supported areas of Blackboard. MathJax in Ultra currently works only with $$…$$ delimiters, not \(…\) or \[…\]; however, the $$…$$ delimiters can be used for inline maths as well as standalone formulae. Please see this MathJax support page for information on TeX and LaTeX support.
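As an illustration, an instructor could type something like the following directly into the Content Editor (the quadratic formula here is just a hypothetical example, not taken from the release notes). Both the inline expressions and the standalone formula use the $$…$$ delimiters that Ultra currently supports:

```latex
% Inline maths: these expressions render within the flow of the sentence
The roots of $$ax^2 + bx + c = 0$$ (with $$a \neq 0$$) are given by the quadratic formula:

% Standalone formula: placed on a line of its own
$$x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}$$
```

Note that the same delimiters are used in both cases; whether the formula renders inline or stands alone depends on how it is placed within the surrounding text.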

Please see the Release Notes for further information on these and the other updates in this release.

Further help and support

For more guidance on using Blackboard Ultra, please explore Ultra Essential Guidance. 

Students can view an Ultra orientation and advice on using Ultra from the Help for Students section (or equivalent) in their Blackboard courses.

Questionmark OnDemand Scheduled Maintenance 19 July 2025 09:00 – 15:00

Questionmark will be carrying out scheduled maintenance on Saturday 19 July between 09:00 and 15:00.

The purpose of the maintenance is to deploy system updates to ensure the ongoing reliability, security and uptime of the platform.

During this time users may experience service disruption. We apologise for any inconvenience caused.

If you have any questions, please email digital-education@bristol.ac.uk.