LSTM with changing batch size while training
I'm trying to build an LSTM on app-log data from different users. I have one big dataframe consisting of stacked app records of the users, so for example the first 1500 rows are for user 1, the follow...
Lilli Meier
Votes: 0
Answers: 1
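The varying-batch-size setup described in this question is usually handled in Keras by leaving the batch dimension unspecified in the model, so each user's windows can be fed as a batch of whatever size is available. A minimal sketch, assuming per-user sequences already windowed into shape (timesteps, features); the sizes and layer widths are illustrative, not from the question:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

TIMESTEPS, FEATURES = 50, 8  # assumed window length and feature count

# Leaving the batch dimension out of the Input shape lets train_on_batch /
# model.fit accept a different batch size on every call, e.g. one batch per user.
model = keras.Sequential([
    layers.Input(shape=(TIMESTEPS, FEATURES)),
    layers.LSTM(32),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Hypothetical per-user batches of different sizes (30 and 12 windows here).
for batch_size in (30, 12):
    x = np.random.rand(batch_size, TIMESTEPS, FEATURES).astype("float32")
    y = np.random.rand(batch_size, 1).astype("float32")
    model.train_on_batch(x, y)
```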
Network stops learning once batch size is set to > 1
I started switching from Keras to PyTorch and played around with a simple feedforward network today. It is supposed to learn the squaring operation, i.e. f(x) = x^2. However, my network only learns...
StrictlyStationaryPoster
Votes: 0
Answers: 1
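A frequent cause of this symptom in PyTorch is a silent shape mismatch: the network output has shape (N, 1) while the target has shape (N,), so nn.MSELoss broadcasts them into an (N, N) matrix, which only coincides with the intended loss when N = 1. A minimal sketch (not the asker's code) of an MLP learning f(x) = x^2 that keeps both tensors at shape (N, 1):

```python
import torch
from torch import nn

# Small MLP learning f(x) = x^2; inputs and targets both kept at shape (N, 1)
# so nn.MSELoss never broadcasts (N,) against (N, 1).
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

batch_size = 32  # trains for any batch size, not only 1
for step in range(2000):
    x = torch.rand(batch_size, 1) * 4 - 2   # samples in [-2, 2], shape (N, 1)
    y = x ** 2                               # targets, shape (N, 1)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```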
`generator` yielded an element of shape (8, 0) where an element of shape (None,) was expected. Traceback (most recent call last):
I was training a network and decided to add more data for training. My data set is selected from another dataset, but both have shape (460, 620, 3) and uint8 type. But when I train my net with this data, I got ...
amina
Votes: 0
Answers: 1
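Errors of this form typically mean the shapes yielded by the Python generator do not match the element signature declared to tf.data.Dataset.from_generator. A minimal sketch, assuming image/label pairs with the (460, 620, 3) uint8 images mentioned in the question; the generator body and label dtype are illustrative:

```python
import numpy as np
import tensorflow as tf

def gen():
    # Hypothetical generator: each element is one (image, label) pair whose
    # shapes and dtypes match the TensorSpecs declared below.
    for _ in range(16):
        image = np.zeros((460, 620, 3), dtype=np.uint8)
        label = np.int32(0)
        yield image, label

dataset = tf.data.Dataset.from_generator(
    gen,
    output_signature=(
        tf.TensorSpec(shape=(460, 620, 3), dtype=tf.uint8),
        tf.TensorSpec(shape=(), dtype=tf.int32),
    ),
).batch(8)
```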
Batch size cannot be greater than 1 after using custom loss function
I have a custom loss function in an LSTM. The model runs well while the batch size is 1 but gives me the error: Input to reshape is a tensor with 2 values, but the requested shape has 1
[[{{node loss/Reshape}}...
Sudan pokharel
Votes: 0
Answers: 0
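Reshape failures that only appear once batch_size > 1 usually come from a fixed size baked into the custom loss. A minimal sketch (an assumed pattern, not the asker's loss) of a batch-size-agnostic Keras loss that flattens with -1 instead of a hard-coded shape of 1:

```python
import tensorflow as tf

def custom_loss(y_true, y_pred):
    # Flatten with -1 so the reshape works for any batch size, instead of
    # hard-coding a size-1 shape that breaks once batch_size > 1.
    y_true = tf.reshape(y_true, [-1])
    y_pred = tf.reshape(y_pred, [-1])
    return tf.reduce_mean(tf.square(y_true - y_pred))

# model.compile(optimizer="adam", loss=custom_loss)
```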