chainer_chemistry.models.RelGAT

class chainer_chemistry.models.RelGAT(out_dim, hidden_dim=16, n_heads=3, negative_slope=0.2, n_edge_types=4, n_layers=4, dropout_ratio=-1.0, activation=<function identity>, n_atom_types=117, softmax_mode='across', concat_hidden=False, concat_heads=False, weight_tying=False)[source]

Relational Graph Attention Networks (RelGAT)
- See: Veličković, Petar, et al. (2017). Graph Attention Networks. arXiv:1710.10903 <https://arxiv.org/abs/1710.10903>
- Dan Busbridge, et al. (2018). Relational Graph Attention Networks. <https://openreview.net/forum?id=Bklzkh0qFm>
Parameters:
- out_dim (int) – dimension of the output feature vector
- hidden_dim (int) – dimension of the feature vector associated with each atom
- n_layers (int) – number of layers
- n_atom_types (int) – number of types of atoms
- n_heads (int) – number of multi-head attentions
- n_edge_types (int) – number of edge types
- dropout_ratio (float) – dropout ratio of the normalized attention coefficients
- negative_slope (float) – negative slope of the LeakyReLU activation
- softmax_mode (str) – take the softmax over the logits 'across' or 'within' relation types; see the Relational GAT paper for a detailed discussion
- concat_hidden (bool) – if True, readout is performed at each layer and the results are concatenated
- concat_heads (bool) – whether to concatenate or average the multi-head attentions
- weight_tying (bool) – whether to enable weight tying
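The softmax_mode parameter distinguishes two ways of normalizing the attention logits over relation (edge) types. A minimal NumPy sketch of the difference for a single target node follows; shapes and function names here are illustrative, not the chainer_chemistry implementation:

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_coefficients(logits, mode='across'):
    """logits: (n_edge_types, n_neighbors) attention logits for one node.

    'within' -- softmax over neighbors separately per edge type,
                so each row sums to 1.
    'across' -- a single softmax over all (edge type, neighbor)
                pairs, so the whole matrix sums to 1.
    """
    if mode == 'within':
        return softmax(logits, axis=1)
    if mode == 'across':
        return softmax(logits.reshape(-1), axis=0).reshape(logits.shape)
    raise ValueError(mode)

logits = np.array([[1.0, 2.0],
                   [0.5, 1.5]])
within = attention_coefficients(logits, 'within')
across = attention_coefficients(logits, 'across')
```

With 'within', each edge type competes only among its own neighbors; with 'across', edge types also compete with each other, which is the default here.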
__init__(out_dim, hidden_dim=16, n_heads=3, negative_slope=0.2, n_edge_types=4, n_layers=4, dropout_ratio=-1.0, activation=<function identity>, n_atom_types=117, softmax_mode='across', concat_hidden=False, concat_heads=False, weight_tying=False)[source]

Initialize self. See help(type(self)) for accurate signature.
Methods

- __init__(out_dim[, hidden_dim, n_heads, …]) – Initialize self.
- add_hook(hook[, name]) – Registers a link hook.
- add_link(name, link) – Registers a child link to this chain.
- add_param(name[, shape, dtype, initializer]) – Registers a parameter to the link.
- add_persistent(name, value) – Registers a persistent value to the link.
- addgrads(link) – Accumulates gradient values from given link.
- children() – Returns a generator of all child links.
- cleargrads() – Clears all gradient arrays.
- copy([mode]) – Copies the link hierarchy to new one.
- copyparams(link[, copy_persistent]) – Copies all parameters from given link.
- count_params() – Counts the total number of parameters.
- delete_hook(name) – Unregisters the link hook.
- disable_update() – Disables update rules of all parameters under the link hierarchy.
- enable_update() – Enables update rules of all parameters under the link hierarchy.
- init_scope() – Creates an initialization scope.
- links([skipself]) – Returns a generator of all links under the hierarchy.
- namedlinks([skipself]) – Returns a generator of all (path, link) pairs under the hierarchy.
- namedparams([include_uninit]) – Returns a generator of all (path, param) pairs under the hierarchy.
- params([include_uninit]) – Returns a generator of all parameters under the link hierarchy.
- register_persistent(name) – Registers an attribute of a given name as a persistent value.
- repeat(n_repeat[, mode]) – Repeats this link multiple times to make a Sequential.
- serialize(serializer) – Serializes the link object.
- to_cpu() – Copies parameter variables and persistent values to CPU.
- to_gpu([device]) – Copies parameter variables and persistent values to GPU.
- to_intel64() – Copies parameter variables and persistent values to CPU.
- zerograds() – Initializes all gradient arrays by zero.

Attributes

- local_link_hooks – Ordered dictionary of registered link hooks.
- update_enabled – True if at least one parameter has an update rule enabled.
- within_init_scope – True if the current code is inside of an initialization scope.
- xp – Array module for this link.