

    Question

    Which approach does BERT use for pre-training?
    A. Skip-gram
    B. Masked Language Model
    C. CBOW
    D. Autoencoding
    E. Reinforcement Learning

    Solution

    The correct answer is B. BERT is pre-trained with two objectives: masked language modeling (MLM), in which roughly 15% of input tokens are selected and the model learns to predict the originals from bidirectional context, and next sentence prediction (NSP), which classifies whether one sentence follows another. Skip-gram and CBOW are word2vec training objectives, not BERT's.
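    The masking scheme can be illustrated with a toy sketch. Following the BERT paper's 80/10/10 rule, each selected position is replaced by a [MASK] token 80% of the time, by a random token 10% of the time, and left unchanged 10% of the time. The function below is a simplified illustration, not BERT's actual tokenizer or training code:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Toy BERT-style masking: pick ~mask_prob of positions; each picked
    position becomes [MASK] 80% of the time, a random vocabulary token
    10% of the time, or stays unchanged 10% of the time."""
    rng = random.Random(seed)
    vocab = sorted(set(tokens))
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)        # 80%: replace with [MASK]
            elif r < 0.9:
                masked.append(rng.choice(vocab)) # 10%: replace with a random token
            else:
                masked.append(tok)               # 10%: keep the original token
        else:
            labels.append(None)  # position not selected; no loss computed
            masked.append(tok)
    return masked, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(tokens)
print(masked)
```

    During pre-training, the loss is computed only at the selected positions (where `labels` is not None), which is what lets BERT learn deep bidirectional representations without the model trivially seeing the token it must predict.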
