Hi, I'm trying to run BLOOM inference on my Apple Silicon Mac (20 cores, 128 GB RAM), but the model runs extremely slowly on CPU (60 s/layer, seemingly not properly parallelized), and the MPS backend doesn't work properly either (it outputs identical tokens for different inputs, though at 0.1 s/layer).

I'm trying to mitigate this by adopting tensorflow-metal (by Apple), which is presumably more polished on macOS, but I couldn't find an easy way to convert the BLOOM PyTorch weights back to TensorFlow checkpoints. I tried mildly modifying both transformers/convert_pytorch_checkpoint_to_tf2.py and transformers/commands/pt_to_tf.py on the chance that either would work. The closest I got was forcing pt_to_tf.py to load the whole model and then save it, but a SIGKILL (most likely an OOM kill) kicked in. Do you have, by any chance, an idea of how to properly convert the bin files back to TensorFlow checkpoints layer by layer, on the fly, like what you did for loading and inference? Also, maybe unrelated (and I'm not sure the claim is even true), I've seen reports that Intel's OpenMP framework is a negative optimization on non-Intel CPUs. I'm not sure whether we are hitting a similar problem, since I think the M1 shares a similar architecture with AMD processors in terms of CPU cache design and the lack of AVX-512 support.

> Have you alternatively tried running an optimized version of PyTorch for Mac (e.g., )? Converting the checkpoints on the fly might be time-consuming. Perhaps a better option is to convert the entire model to TensorFlow checkpoints. This should be doable with a script that takes each of the 72 PyTorch bin files and converts them to TF checkpoints one by one, to avoid OOM. However, even if you have the model in TF's format, you may need to write your own BLOOM TensorFlow source code, because as far as I know, the BLOOM code currently in the Transformers repo is PyTorch-only.

Sorry for the confusing wording around "on the fly"; I actually meant, as you mentioned, converting the weights layer by layer, in contrast to what pt_to_tf.py does (load the whole model, then resave). Sorry if I deviated from the original topic as well.
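The shard-by-shard idea suggested above can be sketched roughly as follows. This is only a sketch under assumptions of mine, not what pt_to_tf.py actually does: it assumes shard files named `pytorch_model*.bin`, uses a NumPy `.npz` archive as a hypothetical intermediate format (rather than a real TF checkpoint), and casts everything to float32 because NumPy has no bfloat16. The point is the memory profile: only one shard is resident at a time.

```python
# Sketch: convert each PyTorch shard to an .npz archive one at a time,
# so peak memory is one shard (~a few GB) instead of the whole model.
# File-name pattern and the .npz intermediate are illustrative assumptions.
import glob
import os

import numpy as np
import torch


def convert_shards(src_dir: str, out_dir: str) -> None:
    os.makedirs(out_dir, exist_ok=True)
    for bin_path in sorted(glob.glob(os.path.join(src_dir, "pytorch_model*.bin"))):
        # map_location="cpu" keeps the shard off any accelerator
        state_dict = torch.load(bin_path, map_location="cpu")
        # cast to float32: NumPy cannot represent bfloat16 tensors
        arrays = {name: t.float().numpy() for name, t in state_dict.items()}
        out_path = os.path.join(
            out_dir, os.path.basename(bin_path).replace(".bin", ".npz")
        )
        np.savez(out_path, **arrays)
        # drop references so this shard is freed before loading the next one
        del state_dict, arrays
```

A second pass could then read each `.npz` back and build `tf.Variable` objects from it, but mapping the PyTorch parameter names onto whatever a TF BLOOM implementation expects would still have to be done by hand, as the reply above points out.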
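On the "seemingly not properly parallelized" CPU path: one knob worth checking (my suggestion, not something established in this thread) is PyTorch's intra-op thread count, which can end up lower than the physical core count depending on how the wheel was built. A minimal check-and-pin sketch:

```python
# Sketch: inspect and explicitly pin PyTorch's intra-op thread pool.
# The value 20 is an assumption matching the 20-core machine described above.
import torch

print(torch.get_num_threads())  # how many intra-op threads PyTorch will use
torch.set_num_threads(20)       # pin to the number of physical cores
print(torch.get_num_threads())
```

The `OMP_NUM_THREADS` environment variable (set before the process starts) influences the same pool; if the first print already shows a sensible core count, the slowness is likely elsewhere.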