training ~

#1
by LeroyDyer - opened

I tried to train my model with this dataset!

And I got a loss of 0.0092, LOL! Whoop whoop! It was overfitting the exact data well!
Over time I pushed fewer and fewer trainable parameters and still got exact matches.
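I don't show my exact setup here, but a low-rank LoRA adapter is the usual way to shrink the trainable parameter count. A minimal sketch, assuming the `peft` library; the base model name and rank `r` are illustrative, not my exact values:

```python
# A minimal sketch, assuming "fewer parameters" means a low-rank LoRA
# adapter via the peft library. Base model and rank are illustrative.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
lora = LoraConfig(
    r=8,                                   # lower r = fewer trainable parameters
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],   # which attention projections to adapt
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()         # shows how few weights actually train
```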

When you overfit the model, it will return the exact info you're looking for. And in this case, the Bible, we don't want any hallucinations!

SO:
When you train the model on multiple different versions of the Bible, the model will begin to choose its easiest return. So to favour a particular Bible you need to overfit it: more epochs! The dataset is 31k sentences. I also have the same dataset in multilingual form, which is exactly the same length.
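Here is a minimal sketch of an overfitting run along these lines, assuming the Hugging Face `Trainer` on plain verse text. The model name, file name, and hyperparameters are illustrative guesses, not the exact setup:

```python
# A rough sketch: deliberately train for many epochs on the verse text so
# the model memorizes it near-verbatim. Model, file, and hyperparameters
# are illustrative.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "mistralai/Mistral-7B-v0.1"   # hypothetical base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

dataset = load_dataset("text", data_files="bible_verses.txt")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="bible-overfit",
    num_train_epochs=10,                   # deliberately many epochs to overfit
    per_device_train_batch_size=4,
    learning_rate=2e-5,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # loss dropping toward ~0.009 signals near-verbatim memorization
```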

My personal Bible dataset I chunked instead, and used it pretrain-style, dumping large sections in. Later I used the Q/A technique.
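A rough sketch of the chunking approach: concatenate the whole text and cut it into fixed-size token blocks, so each training example carries a large contiguous section. The file name, model name, block size, and the `reference`/`verse` column names in the Q/A part are all hypothetical:

```python
# Chunking sketch: flatten the tokenized text and slice it into fixed-size
# blocks for pretrain-style causal LM training. All names are illustrative.
from itertools import chain

from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
raw = load_dataset("text", data_files="my_bible.txt")["train"]
tokenized = raw.map(lambda b: tokenizer(b["text"]), batched=True,
                    remove_columns=["text"])

block_size = 1024  # tokens per chunk

def group_into_blocks(batch):
    # Flatten every tokenized line into one stream, then slice it.
    ids = list(chain.from_iterable(batch["input_ids"]))
    total = (len(ids) // block_size) * block_size
    blocks = [ids[i:i + block_size] for i in range(0, total, block_size)]
    return {"input_ids": blocks, "labels": [list(b) for b in blocks]}

chunked = tokenized.map(group_into_blocks, batched=True,
                        remove_columns=tokenized.column_names)

# A guess at the later Q/A stage: pair a verse reference as the question
# with the verse text as the answer. "reference" and "verse" are
# hypothetical column names.
def to_qa(example):
    return {"text": (f"### Question:\nWhat does {example['reference']} say?\n"
                     f"### Answer:\n{example['verse']}")}
```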
