Commit c046426

michaelawyu authored and Jon Wayne Parrott committed
Fixed Failed Kokoro Test (Dataproc) (GoogleCloudPlatform#1203)
* Fixed Failed Kokoro Test (Dataproc)
* Fixed Lint Error
* Update dataproc_e2e_test.py
* Update dataproc_e2e_test.py
* Fixing More Lint Errors
* Fixed b/65407087
* Revert "Merge branch 'master' of https://github.com/michaelawyu/python-docs-samples" (this reverts commit 1614c7d, reversing changes made to cd1dbfd)
* Revert "Fixed b/65407087" (this reverts commit cd1dbfd)
* Fixed Lint Error
* Fixed Lint Error
1 parent 567ef35 commit c046426

2 files changed: +11, -6 lines changed

‎dataproc/dataproc_e2e_test.py

1 addition, 4 deletions
@@ -18,17 +18,14 @@
 
 import os
 
-from gcp_devrel.testing.flaky import flaky
-
 import submit_job_to_cluster
 
 PROJECT = os.environ['GCLOUD_PROJECT']
 BUCKET = os.environ['CLOUD_STORAGE_BUCKET']
-CLUSTER_NAME = 'testcluster2'
+CLUSTER_NAME = 'testcluster3'
 ZONE = 'us-central1-b'
 
 
-@flaky
 def test_e2e():
     output = submit_job_to_cluster.main(
         PROJECT, ZONE, CLUSTER_NAME, BUCKET)
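Net effect of this hunk: the gcp_devrel flaky retry decorator is removed (the test now fails on the first error instead of being retried) and the test targets a new cluster name, presumably to avoid collisions with resources left over from earlier runs. A minimal reconstruction of the module head as it reads after this commit (license header and the assertion lines that follow are outside the hunk; indentation and blank-line placement are assumed):

import os

import submit_job_to_cluster

PROJECT = os.environ['GCLOUD_PROJECT']
BUCKET = os.environ['CLOUD_STORAGE_BUCKET']
CLUSTER_NAME = 'testcluster3'
ZONE = 'us-central1-b'


def test_e2e():
    output = submit_job_to_cluster.main(
        PROJECT, ZONE, CLUSTER_NAME, BUCKET)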

‎dataproc/submit_job_to_cluster.py

10 additions, 2 deletions
@@ -25,12 +25,12 @@
 def get_default_pyspark_file():
     """Gets the PySpark file from this directory"""
     current_dir = os.path.dirname(os.path.abspath(__file__))
-    f = open(os.path.join(current_dir, DEFAULT_FILENAME), 'r')
+    f = open(os.path.join(current_dir, DEFAULT_FILENAME), 'rb')
     return f, DEFAULT_FILENAME
 
 
 def get_pyspark_file(filename):
-    f = open(filename, 'r')
+    f = open(filename, 'rb')
     return f, os.path.basename(filename)
 
 
@@ -76,6 +76,14 @@ def create_cluster(dataproc, project, zone, region, cluster_name):
         'config': {
             'gceClusterConfig': {
                 'zoneUri': zone_uri
+            },
+            'masterConfig': {
+                'numInstances': 1,
+                'machineTypeUri': 'n1-standard-1'
+            },
+            'workerConfig': {
+                'numInstances': 2,
+                'machineTypeUri': 'n1-standard-1'
             }
         }
     }
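Two separate fixes land in this file. First, the PySpark job file is now opened in binary mode ('rb'); the handle is uploaded to the Cloud Storage bucket elsewhere in the sample, and in Python 3 a text-mode handle yields str rather than bytes, which upload code expecting raw bytes can reject. Second, the cluster request now pins an explicit shape: one n1-standard-1 master and two n1-standard-1 workers. A minimal sketch of the resulting request body, assuming the fields outside 'config' (projectId, clusterName) follow the standard Dataproc clusters.create request format, since they are not visible in this hunk:

def build_cluster_data(project, cluster_name, zone_uri):
    # Sketch only: the sample builds this dict inline inside create_cluster();
    # projectId/clusterName are assumed, the 'config' block is taken from the diff above.
    return {
        'projectId': project,
        'clusterName': cluster_name,
        'config': {
            'gceClusterConfig': {
                'zoneUri': zone_uri
            },
            'masterConfig': {
                'numInstances': 1,
                'machineTypeUri': 'n1-standard-1'
            },
            'workerConfig': {
                'numInstances': 2,
                'machineTypeUri': 'n1-standard-1'
            }
        }
    }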
