
fasttext: init at 0.2.0 #60781

Merged
c0bw3b merged 1 commit into NixOS:master from danieldk:fasttext-0.2.0 on May 5, 2019

Conversation

@danieldk (Contributor) commented on May 2, 2019

Motivation for this change

This change adds a derivation for fastText. fastText is a tool and library for training word embeddings with subword units and for training text classification models. It is popular in the computational linguistics and natural language processing communities.
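For context, here is a minimal sketch of what such a derivation could look like, written against current nixpkgs conventions. The upstream repo, tag, CMake build, and license fields are assumptions on my part; the expression actually merged in this PR may differ.

```nix
{ lib, stdenv, fetchFromGitHub, cmake }:

stdenv.mkDerivation rec {
  pname = "fasttext";
  version = "0.2.0";

  # Assumed upstream source and tag; the merged derivation may pin these differently.
  src = fetchFromGitHub {
    owner = "facebookresearch";
    repo = "fastText";
    rev = "v${version}";
    sha256 = lib.fakeSha256; # placeholder, not the real hash
  };

  # fastText ships a CMakeLists.txt, so the generic CMake build hook suffices.
  nativeBuildInputs = [ cmake ];

  meta = with lib; {
    description = "Library for text representation and classification";
    homepage = "https://fasttext.cc/";
    license = licenses.mit; # fastText is MIT-licensed upstream
    platforms = platforms.unix;
  };
}
```

From a nixpkgs checkout, the package could then be built with `nix-build -A fasttext`.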

Things done
  • Tested using sandboxing (nix.useSandbox on NixOS, or option sandbox in nix.conf on non-NixOS)
  • Built on platform(s)
    • NixOS
    • macOS
    • other Linux distributions
  • Tested via one or more NixOS test(s) if existing and applicable for the change (look inside nixos/tests)
  • Tested compilation of all pkgs that depend on this change using nix-shell -p nix-review --run "nix-review wip"
  • Tested execution of all binary files (usually in ./result/bin/)
  • Determined the impact on package closure size (by running nix path-info -S before and after)
  • Assured whether relevant documentation is up to date
  • Fits CONTRIBUTING.md.

@danieldk (Contributor, Author) commented on May 2, 2019

@GrahamcOfBorg build fasttext

@c0bw3b (Contributor) commented on May 5, 2019

To check on Darwin:
@GrahamcOfBorg build fasttext

@c0bw3b merged commit 3513201 into NixOS:master on May 5, 2019
@danieldk deleted the fasttext-0.2.0 branch on June 4, 2019