zacbrld/MNLP_M3_document_encoder

Tags: Sentence Similarity · sentence-transformers · Safetensors · bert · feature-extraction · Generated from Trainer · dataset_size:42185 · loss:TripletLoss · text-embeddings-inference

Instructions to use zacbrld/MNLP_M3_document_encoder with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

  • Libraries
  • sentence-transformers

    How to use zacbrld/MNLP_M3_document_encoder with sentence-transformers (a query–document ranking sketch follows after this list):

    from sentence_transformers import SentenceTransformer
    
    # Download the model from the Hugging Face Hub
    model = SentenceTransformer("zacbrld/MNLP_M3_document_encoder")
    
    sentences = [
        "For example, t ∈ { 0 , 1 , … , N } , N 0 ,  or {\\mbox{ or }}[0,+\\infty ).} Similarly, a filtered probability space (also known as a stochastic basis) ( Ω , F , { F t } t ≥ 0 , P ) {\\displaystyle \\left(\\Omega ,{\\mathcal {F}},\\left\\{{\\mathcal {F}}_{t}\\right\\}_{t\\geq 0},\\mathbb {P} \\right)} , is a probability space equipped with the filtration { F t } t ≥ 0 {\\displaystyle \\left\\{{\\mathcal {F}}_{t}\\right\\}_{t\\geq 0}} of its σ {\\displaystyle \\sigma } -algebra F {\\displaystyle {\\mathcal {F}}} . A filtered probability space is said to satisfy the usual conditions if it is complete (i.e., F 0 {\\displaystyle {\\mathcal {F}}_{0}} contains all P {\\displaystyle \\mathbb {P} } -null sets) and right-continuous (i.e. F t = F t + := ⋂ s > t F s {\\displaystyle {\\mathcal {F}}_{t}={\\mathcal {F}}_{t+}:=\\bigcap _{s>t}{\\mathcal {F}}_{s}} for all times t {\\displaystyle t} ).It is also useful (in the case of an unbounded index set) to define F ∞ {\\displaystyle {\\mathcal {F}}_{\\infty }} as the σ {\\displaystyle \\sigma } -algebra generated by the infinite union of the F t {\\displaystyle {\\mathcal {F}}_{t}} 's, which is contained in F {\\displaystyle {\\mathcal {F}}}: F ∞ = σ ( ⋃ t ≥ 0 F t ) ⊆ F .",
        "These individuals can experience these symptoms from failed attempts of depression like symptoms.Narcissistic personality disorder is characterized as feelings of superiority, a sense of grandiosity, exhibitionism, charming but also exploitive behaviors in the interpersonal domain, success, beauty, feelings of entitlement and a lack of empathy. Those with this disorder often engage in assertive self enhancement and antagonistic self protection. All of these factors can lead an individual with narcissistic personality disorder to manipulate others.",
        "{\\displaystyle {\\mathcal {F}}_{\\infty }=\\sigma \\left(\\bigcup _{t\\geq 0}{\\mathcal {F}}_{t}\\right)\\subseteq {\\mathcal {F}}.} A σ-algebra defines the set of events that can be measured, which in a probability context is equivalent to events that can be discriminated, or \"questions that can be answered at time t {\\displaystyle t} \". Therefore, a filtration is often used to represent the change in the set of events that can be measured, through gain or loss of information. A typical example is in mathematical finance, where a filtration represents the information available up to and including each time t {\\displaystyle t} , and is more and more precise (the set of measurable events is staying the same or increasing) as more information from the evolution of the stock price becomes available.",
        "Section: Structure and dynamics > Composition. Like microtubules, neurotubules are made up of protein polymers of α-tubulin and β-tubulin, globular proteins that are closely related. They join together to form a dimer, called tubulin. Neurotubules are generally assembled by 13 protofilaments which are polymerized from tubulin dimers. As a tubulin dimer consists of one α-tubulin and one β-tubulin, one end of the neurotubule is exposed with the α-tubulin and the other end with β-tubulin, these two ends contribute to the polarity of the neurotubule – the plus (+) end and the minus (-) end. The β-tubulin subunit is exposed on the plus (+) end. The two ends differ in their growth rate: plus (+) end is the fast-growing end while minus (-) end is the slow-growing end. Both ends have their own rate of polymerization and depolymerization of tubulin dimers, net polymerization causes the assembly of tubulin, hence the length of the neurotubules."
    ]
    # Run inference: one embedding per input text
    embeddings = model.encode(sentences)
    
    # Pairwise similarity scores between the embeddings (cosine by default)
    similarities = model.similarity(embeddings, embeddings)
    print(similarities.shape)
    # torch.Size([4, 4])
  • Notebooks
  • Google Colab
  • Kaggle
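
Beyond embedding a fixed batch, the same API supports ranking documents against a query, which is the typical use of a document encoder. The sketch below is illustrative only: the query and corpus are hypothetical, and it assumes the model's default similarity function (cosine unless config_sentence_transformers.json says otherwise).

    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("zacbrld/MNLP_M3_document_encoder")

    # Hypothetical query and corpus, for illustration only
    query = "What are the usual conditions for a filtered probability space?"
    corpus = [
        "A filtered probability space satisfies the usual conditions if it is complete and right-continuous.",
        "Neurotubules are protein polymers of α-tubulin and β-tubulin arranged in 13 protofilaments.",
        "Narcissistic personality disorder involves grandiosity and a lack of empathy.",
    ]

    # Encode the query and documents separately, then score every pair
    query_embedding = model.encode([query])
    corpus_embeddings = model.encode(corpus)
    scores = model.similarity(query_embedding, corpus_embeddings)[0]

    # Rank documents by similarity to the query, best first
    for idx in scores.argsort(descending=True).tolist():
        print(f"{scores[idx].item():.3f}  {corpus[idx]}")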
Files and versions
  • 1 contributor
History: 2 commits
zacbrld
Add new SentenceTransformer model.
c90ed0d verified 11 months ago
  • 1_Pooling
    Add new SentenceTransformer model. 11 months ago
  • .gitattributes
    1.52 kB
    initial commit 11 months ago
  • README.md
    44.3 kB
    Add new SentenceTransformer model. 11 months ago
  • config.json
    617 Bytes
    Add new SentenceTransformer model. 11 months ago
  • config_sentence_transformers.json
    205 Bytes
    Add new SentenceTransformer model. 11 months ago
  • model.safetensors
    90.9 MB
    Add new SentenceTransformer model. 11 months ago
  • modules.json
    229 Bytes
    Add new SentenceTransformer model. 11 months ago
  • sentence_bert_config.json
    53 Bytes
    Add new SentenceTransformer model. 11 months ago
  • special_tokens_map.json
    695 Bytes
    Add new SentenceTransformer model. 11 months ago
  • tokenizer.json
    712 kB
    Add new SentenceTransformer model. 11 months ago
  • tokenizer_config.json
    1.46 kB
    Add new SentenceTransformer model. 11 months ago
  • vocab.txt
    232 kB
    Add new SentenceTransformer model. 11 months ago