@yookontian yookontian commented Apr 2, 2023

Compared to generating random negative samples, it may be better to have the model learn explicitly that "if the label is 1, it cannot be 0, 2, 3, 4, etc.".
I changed the negative-sample generation method accordingly, and it worked: a 3.5% improvement on the test dataset.
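The idea above can be sketched as follows. This is a minimal illustration of exhaustive negative sampling, not the actual code from this PR; the label set, data shapes, and function names are assumptions.

```python
def make_negatives(example, label, label_set):
    """For one positive (example, label) pair, emit every other label
    in label_set as a negative pair (final 0 marks a negative)."""
    return [(example, neg, 0) for neg in label_set if neg != label]

# Assumed label vocabulary and a stand-in positive example.
label_set = [0, 1, 2, 3, 4]
example, label = "some input", 1

pairs = [(example, label, 1)]                      # the positive pair
pairs += make_negatives(example, label, label_set) # all wrong labels as negatives
```

With this scheme every positive example yields one negative per wrong label, so the dataset size grows linearly with the label-set size, which matches the roughly 10x blow-up described below for a larger label set.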
However, this grows the dataset to roughly 10x the size of the original training set, and my GPU can't handle training on the whole dataset at once, so I added batch training as commented-out code in main().
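The batch-training workaround could look something like the sketch below. This is an illustrative chunking loop under assumed names (`batches`, `dataset`), not the commented-out code in the PR's main().

```python
def batches(data, batch_size):
    """Yield successive slices of `data` of length at most `batch_size`."""
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]

# Stand-in for the expanded (~10x) dataset.
dataset = list(range(10))

for batch in batches(dataset, 4):
    # train_step(model, batch) would go here; each batch fits on the GPU.
    pass
```

Splitting the enlarged dataset into fixed-size chunks keeps peak GPU memory bounded by the batch size rather than the full dataset size.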
