To use this model you need to have the node-llama-cpp module installed. It can be installed with npm install -S node-llama-cpp, and the minimum supported version is 2.0.0. This also requires that you have a locally built version of Llama2 installed.
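The install step described above can be run from your project root as follows (the -S flag saves the package to your package.json dependencies):

```shell
# Install node-llama-cpp and record it as a dependency
# (2.0.0 is the minimum supported version)
npm install -S node-llama-cpp
```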