I am trying to run the example from this link: LSTM Network in R | R-bloggers
I am having problems with the following code and cannot move on at all. I have tried everything to fix the error, but it keeps coming back.
library(keras)
library(tensorflow)
library(reticulate)                          # use_condaenv() comes from reticulate
use_condaenv("keras-tf", required = TRUE)    # point reticulate at the "keras-tf" conda env
imdb <- dataset_imdb(num_words = 500)        # load the IMDB reviews dataset
Error: Valid installation of TensorFlow not found.
Python environments searched for 'tensorflow' package:
/home/hidro/miniconda3/envs/keras-tf/bin/python3.11
Python exception encountered:
Traceback (most recent call last):
File "/home/hidro/R/x86_64-pc-linux-gnu-library/4.3/reticulate/python/rpytools/loader.py", line 122, in _find_and_load_hook
return _run_hook(name, _hook)
^^^^^^^^^^^^^^^^^^^^^^
File "/home/hidro/R/x86_64-pc-linux-gnu-library/4.3/reticulate/python/rpytools/loader.py", line 96, in _run_hook
module = hook()
^^^^^^
File "/home/hidro/R/x86_64-pc-linux-gnu-library/4.3/reticulate/python/rpytools/loader.py", line 120, in _hook
return find_and_load(name, import)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/hidro/miniconda3/envs/keras-tf/lib/python3.11/site-packages/tensorflow/init.py", line 37, in
from tensorflow.python.tools import module_util as _module_util
File "/home/hidro/R/x86_64-pc-linux-gnu-l
Then I went to this link: Deep Learning with R, Second Edition,
and ran the code up to the chunk below, which gives me the following error:
int_sequence <- seq(10)

# Now you can create the dataset
dummy_dataset <- timeseries_dataset_from_array(
  data = head(int_sequence, -3),
  targets = tail(int_sequence, -3),
  sequence_length = 3,
  batch_size = 2
)
Error in if (tf_version() < ver) stop("Tensorflow version >=", ver, " required to use ", :
  argument is of length zero
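As far as I understand, this second error has the same root cause: tf_version() returns NULL because no working TensorFlow installation is found, so the version comparison inside the if() has length zero. A quick way to confirm (a sketch, assuming the tensorflow R package is attached) would be:

library(tensorflow)
tf_version()    # NULL when no usable TensorFlow installation is found
tf_config()     # reports which Python/TensorFlow reticulate located, if any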
If there is a way to resolve these installation errors so that I can run the code and finish my work, I would greatly appreciate any help. Thank you!
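For completeness, one approach I am considering is reinstalling Keras and TensorFlow into the "keras-tf" conda environment directly from R (a sketch based on the keras package's installer; the method and envname arguments are my assumptions about the right settings here, and I have not confirmed this fixes the problem):

library(keras)
# Install/refresh the Python side (TensorFlow + Keras) into the conda env
install_keras(method = "conda", envname = "keras-tf")

# After restarting the R session, point reticulate at the env and verify
library(tensorflow)
library(reticulate)
use_condaenv("keras-tf", required = TRUE)
tf_config()    # should now report a TensorFlow version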