---
license: apache-2.0
tags:
- management
- text generation
model-index:
- name: ManaGPT-1010
results: []
language:
- en
pipeline_tag: text-generation
---
# ManaGPT-1010
<img style="float:right; margin:10px; margin-right:30px" src="https://huggingface.co/NeuraXenetica/ManaGPT-1010/resolve/main/ManaGPT_logo_01.png" width="150" height="150">
This model is a fine-tuned version of GPT-2, trained on a custom dataset of scholarly and popular texts from the field of organizational management. The texts concern the emerging effects of posthumanizing technologies (e.g., advanced artificial intelligence, social robotics, virtual reality, neuroprosthetics, and cyber-physical systems) on the structure of organizations and on human beings’ experience of organizational life.
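A minimal generation sketch using the `transformers` library (the prompt is illustrative; since the repository ships TensorFlow weights, pass `framework="tf"` if automatic framework detection fails in your environment):

```python
from transformers import pipeline

# Load the model from the Hugging Face Hub (downloads weights on first use)
generator = pipeline("text-generation", model="NeuraXenetica/ManaGPT-1010")

# Illustrative prompt from the model's domain (organizational management)
outputs = generator(
    "Advanced social robotics will reshape organizational life by",
    max_new_tokens=40,
    do_sample=True,
)
print(outputs[0]["generated_text"])
```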
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning_rate: ExponentialDecay (initial_learning_rate: 0.0005, decay_steps: 500, decay_rate: 0.95, staircase: False)
  - beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False
  - weight_decay_rate: 0.01
- training_precision: float32
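The exponential-decay schedule listed above can be sketched in plain Python (an illustrative helper, not taken from the training code; it mirrors the formula Keras's `ExponentialDecay` uses):

```python
def exponential_decay(step, initial_lr=0.0005, decay_steps=500,
                      decay_rate=0.95, staircase=False):
    """Learning rate at a given training step under exponential decay.

    With staircase=False (as configured here) the decay is continuous:
        lr(step) = initial_lr * decay_rate ** (step / decay_steps)
    """
    exponent = step // decay_steps if staircase else step / decay_steps
    return initial_lr * decay_rate ** exponent

# After 500 steps the rate has decayed by one factor of 0.95:
print(exponential_decay(0))    # 0.0005
print(exponential_decay(500))  # 0.000475
```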
### Training results
### Framework versions
- Transformers 4.27.1
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2