Commit 88583a3

Adding documentation on DSP integration (#396)

* Adding documentation on DSP integration
* Grammar
* Dummy commit to test webhook
* Another dummy commit to test trigger

1 parent 8ad6969 commit 88583a3

2 files changed: +114 -0 lines changed

docs/ADVANCED.md

Lines changed: 4 additions & 0 deletions
@@ -17,6 +17,7 @@ Let's dive into the nitty-gritty of how to tweak the setup of your containerized
 * [Create custom configs](#create-custom-configs)
 * [Enable SmartStore](#enable-smartstore)
 * [Configure cache manager](#configure-cache-manager)
+* [Forward to Data Stream Processor](#forward-to-data-stream-processor)
 * [Use a deployment server](#use-a-deployment-server)
 * [Deploy distributed topology](#deploy-distributed-topology)
 * [Enable SSL internal communication](#enable-ssl-internal-communication)
@@ -325,6 +326,9 @@ splunk:
 ...
 ```
 
+## Forward to Data Stream Processor
+See the [DSP integration document](advanced/DSP.md) to learn how to send data directly from a forwarder to [Splunk Data Stream Processor](https://www.splunk.com/en_us/software/stream-processing.html).
+
 ## Use a deployment server
 Deployment servers can be used to manage otherwise unclustered or disjoint Splunk instances. A primary use-case would be to stand up a deployment server to manage app or configuration distribution to a fleet of 100 universal forwarders.
docs/advanced/DSP.md

Lines changed: 110 additions & 0 deletions
@@ -0,0 +1,110 @@
## Data Stream Processor
[Splunk Data Stream Processor](https://www.splunk.com/en_us/software/stream-processing.html) is a separate service used to collect and facilitate real-time stream processing. For more information, visit the [Splunk Data Stream Processor documentation](https://docs.splunk.com/Documentation/DSP).

The Splunk Docker image supports native integration with DSP through forwarders. Both universal and heavy forwarders can be automatically provisioned to send traffic to DSP, where custom pipelines can be configured to redirect and reformat the data as desired.

## Navigation

* [Forwarding traffic](#forwarding-traffic)
    * [User-generated certificates](#user-generated-certificates)
    * [Auto-generated certificates](#auto-generated-certificates)
* [Defining pipelines](#defining-pipelines)

## Forwarding Traffic
Splunk DSP pipelines can be used to [process forwarder data](https://docs.splunk.com/Documentation/DSP/1.1.0/User/SenddataUF) from either a `splunk_universal_forwarder` or a `splunk_heavy_forwarder` role.

You will need [`scloud`](https://github.com/splunk/splunk-cloud-sdk-go) before proceeding.

### User-generated certificates
To get data into DSP, you must generate a client certificate and register it with the DSP forwarder service. Instructions can be found [here](https://docs.splunk.com/Documentation/DSP/1.1.0/Data/Forwarder), or follow the steps below:
```bash
# Generate a 2048-bit RSA private key
$ openssl genrsa -out my_forwarder.key 2048
# Create a certificate signing request for the forwarder identity
$ openssl req -new -key "my_forwarder.key" -out "my_forwarder.csr" -subj "/C=US/ST=CA/O=my_organization/CN=my_forwarder/[email protected]"
# Self-sign the certificate, valid for two years
$ openssl x509 -req -days 730 -in "my_forwarder.csr" -signkey "my_forwarder.key" -out "my_forwarder.pem" -sha256
# Bundle the certificate and private key for the forwarder to use
$ cat my_forwarder.pem my_forwarder.key > my_forwarder-keys.pem
# Register the public certificate with the DSP forwarder service
$ scloud forwarders add-certificate --pem "$(<my_forwarder.pem)"
```
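
Before registering, it can be worth sanity-checking what you generated. These optional checks use standard `openssl` tooling and are not part of the original instructions:
```bash
# Confirm the subject and validity window of the self-signed certificate
$ openssl x509 -in my_forwarder.pem -noout -subject -dates
# Verify the cert and key are a matching pair (the two digests should be identical)
$ openssl rsa -in my_forwarder.key -noout -modulus | openssl md5
$ openssl x509 -in my_forwarder.pem -noout -modulus | openssl md5
```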

Once you have the resulting `my_forwarder-keys.pem`, it can be mounted into the container and used immediately. Refer to the following `docker-compose.yml` example:
```yaml
version: "3.6"

services:
  hf1:
    image: splunk/splunk:8.0.5
    hostname: hf1
    environment:
      - SPLUNK_ROLE=splunk_heavy_forwarder
      - SPLUNK_START_ARGS=--accept-license
      - SPLUNK_PASSWORD=helloworld
      - SPLUNK_DSP_ENABLE=true
      - SPLUNK_DSP_CERT=/opt/splunk/etc/auth/mycerts/my_forwarder-keys.pem
      - SPLUNK_DSP_SERVER=dsp-master-node.hostname:30001
    ports:
      - 8000
      - 8089
    volumes:
      - ./my_forwarder-keys.pem:/opt/splunk/etc/auth/mycerts/my_forwarder-keys.pem
```
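
If you are not using Compose, the same settings can be expressed as a plain `docker run` command. This is an equivalent sketch built only from the values shown above:
```bash
$ docker run -d --name hf1 --hostname hf1 \
    -e SPLUNK_ROLE=splunk_heavy_forwarder \
    -e SPLUNK_START_ARGS=--accept-license \
    -e SPLUNK_PASSWORD=helloworld \
    -e SPLUNK_DSP_ENABLE=true \
    -e SPLUNK_DSP_CERT=/opt/splunk/etc/auth/mycerts/my_forwarder-keys.pem \
    -e SPLUNK_DSP_SERVER=dsp-master-node.hostname:30001 \
    -p 8000 -p 8089 \
    -v "$(pwd)/my_forwarder-keys.pem:/opt/splunk/etc/auth/mycerts/my_forwarder-keys.pem" \
    splunk/splunk:8.0.5
```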

Alternatively, this can be done through `default.yml` like so:
```yaml
---
splunk:
  dsp:
    enable: True
    server: dsp-master-node.hostname:30001
    cert: /opt/splunk/etc/auth/mycerts/my_forwarder-keys.pem
...
```
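
A `default.yml` like the one above is handed to the container at runtime. Assuming the image's usual convention of reading defaults from `/tmp/defaults/default.yml`, a bind mount is the simplest route:
```bash
# Mount the custom default.yml (and the certificate bundle) into the container
$ docker run -d --name hf1 \
    -e SPLUNK_ROLE=splunk_heavy_forwarder \
    -e SPLUNK_START_ARGS=--accept-license \
    -e SPLUNK_PASSWORD=helloworld \
    -v "$(pwd)/default.yml:/tmp/defaults/default.yml" \
    -v "$(pwd)/my_forwarder-keys.pem:/opt/splunk/etc/auth/mycerts/my_forwarder-keys.pem" \
    splunk/splunk:8.0.5
```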

### Auto-generated Certificates
If you're just getting your feet wet with DSP and these Docker images, it can be helpful to rely on the Docker image to generate the certificates for you. Setting `SPLUNK_DSP_CERT=auto` or `splunk.dsp.cert: auto` lets the container create the certificate and print it to the container's logs, so you can register it yourself:
```bash
$ scloud forwarders add-certificate --pem "<copied from cert printed to container stdout>"
```
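
To avoid copying the certificate by hand, you can pull the PEM block straight out of the container logs. This sketch assumes a container named `hf1` (as in the earlier Compose example) and that the certificate is printed between standard `BEGIN`/`END CERTIFICATE` markers:
```bash
# Extract the printed certificate from the logs and register it with DSP
$ docker logs hf1 2>&1 | sed -n '/BEGIN CERTIFICATE/,/END CERTIFICATE/p' > auto_forwarder.pem
$ scloud forwarders add-certificate --pem "$(<auto_forwarder.pem)"
```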

## Defining Pipelines
In addition to native support for sending data, the Docker image is also capable of configuring the pipeline in DSP, which can be useful for declaratively defining the full end-to-end parsing and ingest flow.

You will need [`scloud`](https://github.com/splunk/splunk-cloud-sdk-go) before proceeding. In addition, you'll need a `.scloud.toml` and a `.scloud_context` with permissions enabled to read from and write to your DSP installation.

Pipeline specifications are defined using [SPL2](https://docs.splunk.com/Documentation/DSP/1.1.0/User/SPL2). Refer to the following `docker-compose.yml` example:
```yaml
version: "3.6"

services:
  hf1:
    image: splunk/splunk:8.0.5
    hostname: hf1
    environment:
      - SPLUNK_ROLE=splunk_heavy_forwarder
      - SPLUNK_START_ARGS=--accept-license
      - SPLUNK_PASSWORD=helloworld
      - SPLUNK_DSP_ENABLE=true
      - SPLUNK_DSP_CERT=auto
      - SPLUNK_DSP_SERVER=dsp-master-node.hostname:30001
      - SPLUNK_DSP_PIPELINE_NAME=ingest-example
      - SPLUNK_DSP_PIPELINE_DESC="Demo using forwarders as source"
      - SPLUNK_DSP_PIPELINE_SPEC='| from receive_from_forwarders("forwarders:all") | into index("", "main");'
    ports:
      - 8000
      - 8089
    volumes:
      - ./.scloud.toml:/home/splunk/.scloud.toml
      - ./.scloud_context:/home/splunk/.scloud_context
```

The `SPLUNK_DSP_PIPELINE_SPEC` above reads events from all connected forwarders and writes them to the `main` index.

Alternatively, this can be done through `default.yml` like so:
```yaml
---
splunk:
  dsp:
    enable: True
    server: dsp-master-node.hostname:30001
    cert: auto
    pipeline_name: ingest-example
    pipeline_desc: "Demo using forwarders as source"
    pipeline_spec: '| from receive_from_forwarders("forwarders:all") | into index("", "main");'
...
```
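
Bringing either variant up follows the usual Compose workflow. For example, assuming the `docker-compose.yml` above, you can start the forwarder and follow the provisioning output (which should include the DSP certificate and pipeline setup steps):
```bash
$ docker-compose up -d hf1
$ docker-compose logs -f hf1
```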
