health_multimodal.text.utils

Functions

get_bert_inference([bert_encoder_type])

Create a TextInferenceEngine for a text encoder model.

get_biovil_t_bert()

Load the BioViL-T BERT model and tokenizer from the Hugging Face Hub.

get_cxr_bert()

Load the CXR-BERT model and tokenizer from the Hugging Face Hub.

Classes

BertEncoderType(value)

Enumeration of the supported text encoder types: CXR_BERT and BIOVIL_T_BERT.

class health_multimodal.text.utils.BertEncoderType(value)[source]

Enumeration of the supported text encoder types: CXR_BERT and BIOVIL_T_BERT.

health_multimodal.text.utils.get_bert_inference(bert_encoder_type=BertEncoderType.BIOVIL_T_BERT)[source]

Create a TextInferenceEngine for a text encoder model.

Parameters

bert_encoder_type (BertEncoderType) – The type of text encoder model to use, either CXR_BERT or BIOVIL_T_BERT.

The model weights are downloaded from the Hugging Face Hub. The resulting engine can be used to compute embeddings for text prompts or to predict masked tokens.

Return type

TextInferenceEngine
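
Example: a minimal usage sketch. The get_bert_inference call is as documented above; the get_embeddings_from_prompt method name on TextInferenceEngine is an assumption used for illustration and may differ in your installed version.

    from health_multimodal.text.utils import BertEncoderType, get_bert_inference

    # Download the BioViL-T BERT checkpoint from the Hugging Face Hub and wrap it
    # in a TextInferenceEngine (this is also the default encoder type).
    text_inference = get_bert_inference(BertEncoderType.BIOVIL_T_BERT)

    # Assumed method name: embed two radiology text prompts with the engine.
    embeddings = text_inference.get_embeddings_from_prompt(
        ["No evidence of pneumonia.", "Right lower lobe consolidation."]
    )
    print(embeddings.shape)  # one embedding vector per prompt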

health_multimodal.text.utils.get_biovil_t_bert()[source]

Load the BioViL-T BERT model and tokenizer from the Hugging Face Hub.

Return type

Tuple[CXRBertTokenizer, CXRBertModel]
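
Example: a minimal sketch that loads the pair and runs a forward pass, assuming the returned tokenizer and model follow the standard Hugging Face BERT interface. The fields available on the model output depend on the installed CXRBertModel version.

    import torch

    from health_multimodal.text.utils import get_biovil_t_bert

    # Returns (CXRBertTokenizer, CXRBertModel); both follow the Hugging Face
    # tokenizer/model conventions.
    tokenizer, model = get_biovil_t_bert()
    model.eval()

    # Tokenize a report sentence and run a standard forward pass.
    inputs = tokenizer("Heart size is normal.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Inspect the output object; its exact fields (hidden states, MLM logits,
    # projected embeddings) are version-dependent and not documented here.
    print(type(outputs))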

health_multimodal.text.utils.get_cxr_bert()[source]

Load the CXR-BERT model and tokenizer from the Hugging Face Hub.

Return type

Tuple[CXRBertTokenizer, CXRBertModel]
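
Example: an illustrative masked-token prediction, since CXR-BERT is trained with a masked language modelling objective. The .logits field on the model output is assumed from the Hugging Face masked-LM convention and may differ in your installed version.

    import torch

    from health_multimodal.text.utils import get_cxr_bert

    # Same return signature as get_biovil_t_bert(): (CXRBertTokenizer, CXRBertModel).
    tokenizer, model = get_cxr_bert()
    model.eval()

    # Mask one word in a radiology-style sentence and ask the model to fill it in.
    text = f"There is no {tokenizer.mask_token} pleural effusion."
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Assumed output field: logits over the vocabulary at every position.
    mask_position = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
    predicted_token_id = outputs.logits[0, mask_position].argmax().item()
    print(tokenizer.decode([predicted_token_id]))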