
    Question

    Which approach does BERT use for pre-training?
    A. Skip-gram
    B. Masked Language Model
    C. CBOW
    D. Autoencoding
    E. Reinforcement Learning

    Solution

    The correct answer is B. BERT is pre-trained with two objectives: masked language modeling (MLM), in which a fraction of the input tokens (about 15%) is replaced with a [MASK] token and the model learns to predict them from the surrounding bidirectional context, and next sentence prediction (NSP), in which the model learns to judge whether two sentences appear consecutively in the original text.
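
    The following is a minimal sketch of the MLM objective in action, assuming the Hugging Face transformers library is installed and using the public bert-base-uncased checkpoint; the example sentence is illustrative.

    ```python
    from transformers import pipeline

    # The fill-mask pipeline loads BERT together with its masked-language-model head.
    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    # BERT was pre-trained to predict tokens hidden behind [MASK],
    # so it can fill in the blank from bidirectional context.
    for prediction in unmasker("BERT uses a [MASK] language model for pre-training."):
        print(f"{prediction['token_str']:>12}  (score: {prediction['score']:.3f})")
    ```

    This is exactly the skill the pre-training objective instills: top-ranked completions come from the model's learned distribution over the masked position.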
