<!--
  ~ Licensed to the Apache Software Foundation (ASF) under one
  ~ or more contributor license agreements.  See the NOTICE file
  ~ distributed with this work for additional information
  ~ regarding copyright ownership.  The ASF licenses this file
  ~ to you under the Apache License, Version 2.0 (the
  ~ "License"); you may not use this file except in compliance
  ~ with the License.  You may obtain a copy of the License at
  ~
  ~   http://www.apache.org/licenses/LICENSE-2.0
  ~
  ~ Unless required by applicable law or agreed to in writing,
  ~ software distributed under the License is distributed on an
  ~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
  ~ KIND, either express or implied.  See the License for the
  ~ specific language governing permissions and limitations
  ~ under the License.
  -->

# Arrow Developer Scripts

This directory contains scripts useful to developers when packaging,
testing, or committing to Arrow.

Merging a pull request requires being a committer on the project. In
addition, you need to have linked your GitHub and ASF accounts on
https://gitbox.apache.org/setup/ to be able to push to GitHub as the main
remote.

NOTE: It may take some time (a few hours) between when you complete
the setup at GitBox and when your GitHub account is added as a
committer.

## How to merge a pull request

Please don't merge PRs using the GitHub web interface.  Instead, set up
your git clone with a remote named `apache` pointing to the
official Arrow repository:
```
git remote add apache git@github.com:apache/arrow.git
```
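
You can check that the remote is configured correctly with `git remote -v`;
the output should include entries like:

```
apache  git@github.com:apache/arrow.git (fetch)
apache  git@github.com:apache/arrow.git (push)
```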

and then run the following command:
```
./dev/merge_arrow_pr.sh
```

This creates a new Python virtual environment under `dev/.venv[PY_VERSION]`
and installs all the dependencies needed to run the Arrow merge script.
Once the dependencies are installed, it runs the merge script.

(We don't provide a wrapper script for Windows yet, so on Windows you'll
have to install the Python dependencies yourself and then run
`dev/merge_arrow_pr.py` directly.)
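
For illustration, a minimal manual setup on Windows might look like this
(the requirements file name below is an assumption; check the `dev/`
directory for the actual dependency list):

```
py -3 -m venv dev\.venv
dev\.venv\Scripts\activate
rem The requirements file name below is assumed, not confirmed:
pip install -r dev\requirements_merge_arrow_pr.txt
python dev\merge_arrow_pr.py
```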

The merge script uses the GitHub REST API. If you encounter rate limit
issues, you may set an `ARROW_GITHUB_API_TOKEN` environment variable to use
a Personal Access Token.

You can specify the username and password of your JIRA account in the
`APACHE_JIRA_USERNAME` and `APACHE_JIRA_PASSWORD` environment variables.
If these aren't supplied, the script will prompt you for them.
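
For example, you could export all three variables before running the script
(placeholder values shown):

```shell
# Placeholders: substitute your own token and JIRA credentials.
export ARROW_GITHUB_API_TOKEN=<your-personal-access-token>
export APACHE_JIRA_USERNAME=<your-jira-username>
export APACHE_JIRA_PASSWORD=<your-jira-password>
```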

Note that the directory of your Arrow git clone must be named `arrow`.

Example output:
```
Which pull request would you like to merge? (e.g. 34):
```
Type the pull request number (from https://github.com/apache/arrow/pulls) and hit enter.
```
=== Pull Request #X ===
title	Blah Blah Blah
source	repo/branch
target	master
url	https://api.github.com/repos/apache/arrow/pulls/X

Proceed with merging pull request #3? (y/n):
```
If this looks good, type `y` and hit enter.
```
From git-wip-us.apache.org:/repos/asf/arrow.git
 * [new branch]      master     -> PR_TOOL_MERGE_PR_3_MASTER
Switched to branch 'PR_TOOL_MERGE_PR_3_MASTER'

Merge complete (local ref PR_TOOL_MERGE_PR_3_MASTER). Push to apache? (y/n):
```
A local branch with the merge has been created.
Type `y` and hit enter to push it to the apache master branch.
```
Counting objects: 67, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (26/26), done.
Writing objects: 100% (36/36), 5.32 KiB, done.
Total 36 (delta 17), reused 0 (delta 0)
To git-wip-us.apache.org:/repos/arrow-mr.git
   b767ac4..485658a  PR_TOOL_MERGE_PR_X_MASTER -> master
Restoring head pointer to b767ac4e
Note: checking out 'b767ac4e'.

You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by performing another checkout.

If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -b with the checkout command again. Example:

  git checkout -b new_branch_name

HEAD is now at b767ac4... Update README.md
Deleting local branch PR_TOOL_MERGE_PR_X
Deleting local branch PR_TOOL_MERGE_PR_X_MASTER
Pull request #X merged!
Merge hash: 485658a5

Would you like to pick 485658a5 into another branch? (y/n):
```
For now, just type `n`, as we only have one branch.

## Verifying Release Candidates

We have provided a script to assist with verifying release candidates. It
takes the release version and the release candidate number as arguments:

```shell
bash dev/release/verify-release-candidate.sh 0.7.0 0
```

Currently this only works on Linux (patches to expand to macOS welcome!). Read
the script for information about system dependencies.

On Windows, we have a script that verifies C++ and Python (requires Visual
Studio 2015):

```
dev/release/verify-release-candidate.bat apache-arrow-0.7.0.tar.gz
```

### Verifying the JavaScript release

For JavaScript-specific releases, use a different verification script:

```shell
bash dev/release/js-verify-release-candidate.sh 0.7.0 0
```

# Integration testing

Build the following base image used by multiple tests:

```shell
docker build -t arrow_integration_xenial_base -f docker_common/Dockerfile.xenial.base .
```
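
To confirm the image built successfully, you can list it by name:

```shell
docker images arrow_integration_xenial_base
```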

## HDFS C++ / Python support

```shell
docker-compose build conda-cpp
docker-compose build conda-python
docker-compose build conda-python-hdfs
docker-compose run --rm conda-python-hdfs
```
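
If a run fails and you want to inspect the environment, you can override the
container's default command to get an interactive shell instead (a standard
`docker-compose run` feature; this assumes `bash` is available in the image):

```shell
docker-compose run --rm conda-python-hdfs bash
```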

## Apache Spark Integration Tests

Tests can be run to ensure that the current snapshot of Java and Python Arrow
works with Spark. This runs a Docker image that builds Arrow C++ and Python
in a Conda environment, builds and installs Arrow Java into the local Maven
repository, builds Spark with the new Arrow artifact, and runs the
Arrow-related unit tests in Spark for Java and Python. Any errors will cause
the run to exit with a non-zero value. To run, use the following commands:

```shell
docker-compose build conda-cpp
docker-compose build conda-python
docker-compose build conda-python-spark
docker-compose run --rm conda-python-spark
```

If you are already building Spark, the following command maps your local
Maven repository into the container and saves time by not downloading all
the dependencies again. Be aware that Docker writes files as root, which can
cause permission problems for Maven on the host (one way to fix the
ownership afterwards is shown after the command).

```shell
docker-compose run --rm -v $HOME/.m2:/root/.m2 conda-python-spark
```
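
If a run leaves root-owned files behind in your local Maven repository, one
way to restore ownership afterwards (assuming a Linux host) is:

```shell
# Reassign everything under ~/.m2 back to your user and group.
sudo chown -R "$(id -u):$(id -g)" $HOME/.m2
```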

NOTE: If the Java API has breaking changes, a patched version of Spark might
need to be used to successfully build.