theano - Is the use of 'givens' really necessary in the deep learning tutorials?


In the deep learning tutorials, the training data is stored in a shared array, and an index is passed to the training function to slice out a minibatch. I understand that this allows the data to be left in GPU memory, as opposed to passing small chunks of data as a parameter to the training function for each minibatch. In previous questions, this was given as the answer to why the givens mechanism is used in the tutorials.

I don't yet see the connection between these two concepts, so I'm probably missing something essential. As far as I understand, the givens mechanism swaps out a variable in the graph for a given symbolic expression (i.e., the given subgraph is inserted in place of that variable). So why not define the computational graph the way we need it in the first place?

Here is a minimal example. I define a shared variable x and an integer index, and then either create a graph that already contains the slicing operation, or create one where the slicing operation is inserted post-hoc via givens. By all appearances, the two resulting functions get_nogivens and get_tutorial are identical (see the debugprints at the end).

But why do the tutorials use the givens pattern, then?

import numpy as np
import theano
import theano.tensor as T

x = theano.shared(np.arange(100), borrow=True, name='x')
index = T.scalar(dtype='int32', name='index')
x_slice = x[index:index+5]

get_tutorial = theano.function([index], x, givens={x: x[index:index+5]}, mode='DebugMode')
get_nogivens = theano.function([index], x_slice, mode='DebugMode')

> theano.printing.debugprint(get_tutorial)
DeepCopyOp [@A] ''   4
 |Subtensor{int32:int32:} [@B] ''   3
   |x [@C]
   |ScalarFromTensor [@D] ''   0
   | |index [@E]
   |ScalarFromTensor [@F] ''   2
     |Elemwise{add,no_inplace} [@G] ''   1
       |TensorConstant{5} [@H]
       |index [@E]

> theano.printing.debugprint(get_nogivens)
DeepCopyOp [@A] ''   4
 |Subtensor{int32:int32:} [@B] ''   3
   |x [@C]
   |ScalarFromTensor [@D] ''   0
   | |index [@E]
   |ScalarFromTensor [@F] ''   2
     |Elemwise{add,no_inplace} [@G] ''   1
       |TensorConstant{5} [@H]
       |index [@E]
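For what it's worth, calling the two compiled functions also gives the same result (a quick sanity check, assuming the code above has been run):

>>> get_tutorial(3)
array([3, 4, 5, 6, 7])
>>> get_nogivens(3)
array([3, 4, 5, 6, 7])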

They use givens here to decouple the actual data passed into the graph from the input data variable. If you explicitly replaced the input variable with x[index * batch_size: (index + 1) * batch_size] when building the graph, it would be a little more messy.
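For illustration, here is a minimal, self-contained sketch of that tutorial-style pattern (the names train_set_x, batch_size, w and the toy least-squares cost are illustrative, not taken from the actual tutorials): the cost graph is written against a plain symbolic input x, and givens swaps in the minibatch slice at compile time, so the whole dataset can stay in a shared variable (and hence on the GPU).

import numpy as np
import theano
import theano.tensor as T

# whole dataset lives in a shared variable (stays on the GPU if one is used)
train_set_x = theano.shared(
    np.random.randn(100, 3).astype(theano.config.floatX),
    borrow=True, name='train_set_x')
batch_size = 10

index = T.lscalar('index')             # minibatch index
x = T.matrix('x')                      # the cost graph is built against this variable
w = theano.shared(np.zeros(3, dtype=theano.config.floatX), name='w')

cost = T.mean((T.dot(x, w) - 1) ** 2)  # toy cost defined in terms of x
g_w = T.grad(cost, w)

train_model = theano.function(
    [index],
    cost,
    updates=[(w, w - 0.1 * g_w)],
    # givens decouples x from the stored data: only here do we state
    # that x means "the index-th minibatch of train_set_x"
    givens={x: train_set_x[index * batch_size: (index + 1) * batch_size]})

# one pass over the data in minibatches
for i in range(100 // batch_size):
    train_model(i)

The cost (and any model built on top of x) never has to know how the data is stored or sliced; only the compiled training function does.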

