Forego the offset loop that was meant to slide values around with another unpack statement.

SilverNexus committed Aug 5, 2018
1 parent 5bae81a commit 0275073
Showing 1 changed file with 3 additions and 8 deletions: train.lua
@@ -287,14 +287,9 @@ function feval(x)
     local doutput_t = clones.criterion[t]:backward(predictions[t], y[t])
     drnn_state[t][#drnn_state[t]+1] = doutput_t
     local dlst = clones.rnn[t]:backward({x[t], unpack(rnn_state[t-1])}, drnn_state[t])
-    drnn_state[t-1] = {}
-    for k,v in pairs(dlst) do
-      if k > 1 then -- k == 1 is gradient on x, which we dont need
-        -- note we do k-1 because first item is dembeddings, and then follow the
-        -- derivatives of the state, starting at index 2. I know...
-        drnn_state[t-1][k-1] = v
-      end
-    end
+    -- dlst[1] is the gradient on x, which we don't need
+    -- using unpack should slide the values into the correct indexes, allowing us to forego a loop.
+    drnn_state[t-1] = {unpack(dlst, 2)}
   end
   ------------------------ misc ----------------------
   -- transfer final state to initial state (BPTT)

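For context, here is a minimal Lua sketch (the values are hypothetical placeholders, not from the repo) of why {unpack(dlst, 2)} reproduces the removed loop: unpack(t, i) returns t[i], t[i+1], ..., t[#t], and collecting those returns in a table constructor shifts them down to start at index 1.

-- Minimal sketch, assuming Lua 5.1 / LuaJIT (the runtime Torch uses), where unpack is a global;
-- on Lua 5.2+ it is table.unpack. dx, ds1, ds2 stand in for gradient tensors.
local dx, ds1, ds2 = 0.1, 0.2, 0.3
local dlst = {dx, ds1, ds2}          -- dlst[1] is the gradient on the input; the rest are state gradients
local shifted = {unpack(dlst, 2)}    -- collects dlst[2..#dlst], so shifted = {ds1, ds2}
-- the removed loop built the same table one key at a time:
local byloop = {}
for k, v in pairs(dlst) do
  if k > 1 then byloop[k-1] = v end
end
assert(shifted[1] == byloop[1] and shifted[2] == byloop[2])

One behavioral note: unpack walks the sequence part of dlst (indexes 2 through #dlst), while the old pairs loop visited every key; for the array-like table that backward() returns here, the two are equivalent.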