Eric on Twitter: "...Introducing Direct Preference Optimization (DPO), a simple classification loss provably equivalent to RLHF"
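The "simple classification loss" the tweet refers to is the DPO objective from Rafailov et al. (2023): a binary logistic loss on the margin between the policy-vs-reference log-ratios of the preferred and dispreferred responses. Below is a minimal PyTorch sketch of that loss, assuming per-sequence log-probabilities have already been computed; the function and argument names are illustrative, not from the tweet or paper.

import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps, policy_rejected_logps,
             ref_chosen_logps, ref_rejected_logps, beta=0.1):
    # Implicit rewards: beta-scaled log-ratio of policy to reference model.
    chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
    rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
    # Logistic (classification) loss on the preference margin.
    return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

# Example with dummy log-probabilities for a batch of two preference pairs.
lp_w = torch.tensor([-12.3, -9.8])   # policy log p(chosen | prompt)
lp_l = torch.tensor([-11.0, -10.5])  # policy log p(rejected | prompt)
rp_w = torch.tensor([-12.0, -10.0])  # reference log p(chosen | prompt)
rp_l = torch.tensor([-10.8, -10.2])  # reference log p(rejected | prompt)
print(dpo_loss(lp_w, lp_l, rp_w, rp_l))

Here beta plays the role of the KL-penalty coefficient in the RLHF objective, which is what makes this classification loss equivalent to the RL formulation under the paper's assumptions.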