Glad to help out! It honestly blows my mind that whatever I wrote above made any sense, and that you made progress on my advice, or at least a step towards progress.
Bookmarked this video a while back for future reference. When you mentioned the batch processing issue, it reminded me of this presentation.
Possible batch processing solution @17:32 with the model.fit() function.
I assume model.fit() is a wrapper around some heavy parallel routines that split your training dataset rows into manageable chunks (mini-batches) to be processed on the GPU as kernel functions. It saves you from writing the for loops yourself, since .fit() takes care of the batching and the vectorized/compiled loops inside GPU space. Python loops are tremendously slow; you can vectorize as much as possible with NumPy arrays, but translating for loops into vectorized code is a little tricky.
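Here's a minimal sketch of what I mean (not the code from the video): the dataset is random dummy data and the layer sizes are just placeholders, but it shows how batch_size lets .fit() do the chunking instead of you writing the loops.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Dummy training set: 10,000 rows, 20 features, binary labels.
x_train = np.random.rand(10000, 20).astype("float32")
y_train = np.random.randint(0, 2, size=(10000, 1))

# Placeholder architecture, just to have something to fit.
model = keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(20,)),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# No manual for loops over the rows: batch_size tells .fit() how to chunk
# the dataset, and each chunk is pushed through the GPU if one is available.
model.fit(x_train, y_train, batch_size=128, epochs=5)
```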
Keras does all the grunt work.
Keras great for prototyping.
Keras == "very few lines of code"