A graph neural network library for Julia based on the deep learning framework [Flux.jl](https://github.com/FluxML/Flux.jl). Its features include:

* Integration with the JuliaGraphs ecosystem.
* Implementation of common graph convolutional layers.
* Fast operations on batched graphs.
* Easy definition of custom layers.
* CUDA support.
## Installation
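The package can be installed with Julia's package manager. A minimal sketch, assuming the package is registered in the General registry under the name GraphNeuralNetworks:

```julia
using Pkg
Pkg.add("GraphNeuralNetworks")
```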
## Acknowledgements
A big thank you goes to @yuehhua for creating [GeometricFlux.jl](https://github.com/FluxML/GeometricFlux.jl), of which GraphNeuralNetworks.jl is a radical redesign.
docs/src/index.md
This is the documentation page for the [GraphNeuralNetworks.jl](https://github.com/CarloLucibello/GraphNeuralNetworks.jl) library.
A graph neural network library for Julia based on the deep learning framework [Flux.jl](https://github.com/FluxML/Flux.jl). GNN.jl is largely inspired by the Python libraries [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/) and [Deep Graph Library](https://docs.dgl.ai/), and by Julia's [GeometricFlux](https://fluxml.ai/GeometricFlux.jl/stable/).
Among its features:
* Integration with the JuliaGraphs ecosystem.
* Implementation of common graph convolutional layers.
* Fast operations on batched graphs.
* Easy definition of custom layers.
* CUDA support.
## Package overview
Let's give a brief overview of the package by solving a graph regression problem with synthetic data.
Usage examples on real datasets can be found in the [examples](https://github.com/CarloLucibello/GraphNeuralNetworks.jl/tree/master/examples) folder.
### Data preparation
First, we create our dataset, consisting of multiple random graphs and associated data features. Then we batch the graphs together into a single graph.
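A minimal sketch of this step, assuming the library's `rand_graph` constructor and the `ndata`/`gdata` keywords for attaching features (the sizes below are arbitrary):

```julia
using Flux, GraphNeuralNetworks

# Create a dataset of 1000 small random graphs, each carrying
# 16-dimensional node features and a scalar graph-level target.
all_graphs = GNNGraph[]
for _ in 1:1000
    g = rand_graph(10, 40,                                  # 10 nodes, 40 edges
                   ndata = (; x = randn(Float32, 16, 10)),  # node features
                   gdata = (; y = randn(Float32)))          # graph-level target
    push!(all_graphs, g)
end

# Batching stacks the graphs into one large graph made of 1000
# disconnected components, which can be processed as a single GNNGraph.
gbatch = Flux.batch(all_graphs)
```

Custom graph convolutional layers are defined by implementing their forward pass in terms of [`propagate`](@ref). For example, the forward pass of a simple GCN-style layer: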
```julia
function (l::GCN)(g::GNNGraph, x::AbstractMatrix{T}) where T
    @assert size(x, 2) == g.num_nodes

    # Computes messages from source/neighbour nodes (j) to target/root nodes (i).
    # The message function will have to handle matrices of size (*, num_edges).
    # In this simple case we just let the neighbor features go through.
    message(xi, xj, e) = xj

    # The + operator gives the sum aggregation.
    # `mean`, `max`, `min`, and `*` are other possibilities.
    x = propagate(message, g, +, xj = x)

    return l.σ.(l.weight * x .+ l.bias)
end
```
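For completeness, here is one way the surrounding layer struct could look: a minimal sketch whose field and constructor names are inferred from the forward pass above rather than taken from the library.

```julia
using GraphNeuralNetworks
using Flux: relu

# Hypothetical struct backing the forward pass above; the field names
# (weight, bias, σ) match those the forward pass accesses.
struct GCN
    weight
    bias
    σ
end

# Glorot-style initialization for an nin → nout layer.
GCN(nin::Int, nout::Int, σ = identity) =
    GCN(randn(Float32, nout, nin) ./ sqrt(Float32(nin)), zeros(Float32, nout), σ)

# Usage, given the forward pass defined above:
g = rand_graph(10, 40)            # 10 nodes, 40 edges
l = GCN(3, 8, relu)               # 3 input features, 8 output features
y = l(g, randn(Float32, 3, 10))   # 8 × 10 matrix of node embeddings
```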
See the [`GATConv`](@ref) implementation [here](https://github.com/CarloLucibello/GraphNeuralNetworks.jl/blob/master/src/layers/conv.jl) for a more complex example.
## Built-in message functions
In order to exploit optimized specializations of [`propagate`](@ref), it is recommended
to use built-in message functions such as [`copyxj`](@ref) whenever possible.
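For instance, the hand-written message function in the GCN example above can be swapped for the built-in one. A sketch, reusing `g` and `x` from before:

```julia
# `copyxj` plays the role of `message(xi, xj, e) = xj` above:
# it forwards the neighbor features unchanged, but lets `propagate`
# dispatch to an optimized, fused implementation when one exists.
x = propagate(copyxj, g, +, xj = x)
```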