# [GR-71934] Run Python gates on Github #574

Status: Open. Ariouz wants to merge 96 commits into `oracle:master` from `Ariouz:vcalvez/github_ci`.

## Commits (96)
- `540b03b` Forked from oracle/graalpython - Initial Commit with ci (Ariouz)
- `c53861f` Fix set-export / unpack-artifact permissions (Ariouz)
- `445be61` Disabled OOM/Buff Overflow tests (Ariouz)
- `68c5dd4` Add phase shift warmup benchmark. (jchalou)
- `6781f3a` Use succinct option format. (jchalou)
- `aad9f30` Add documentation. (jchalou)
- `ff9f041` Process jobs "Downloads" and extract 'key' to env (Ariouz)
- `ff8d4c2` Fix PATH on ci (Ariouz)
- `4165828` Retag bytecode DSL tests (steve-s)
- `1d4d9bd` Retag more Bytecode DSL tests (steve-s)
- `bda8238` Retag more Bytecode DSL tests (steve-s)
- `008a521` Fix test_custom_iterator_return (steve-s)
- `3c75c08` Retag more Bytecode DSL tests (steve-s)
- `2d81110` Untag test failing in CI although it passes locally (steve-s)
- `dd6ed4d` Ignore some flaky ZoneInfoTest tests (steve-s)
- `ed34a7f` Remove transiently timing out test_threading:test_print_exception_gh_… (steve-s)
- `7156473` Update TCK provider to communicate that subscripting works with types… (timfel)
- `614748b` Fix PATH error on Linux/MacOS + Update imports (Ariouz)
- `a494af4` Disable Linux aarch64 svm build (Ariouz)
- `b742111` Construct download links using string format + Disable linux aarch64 … (Ariouz)
- `9f55132` Split python-unittest-retagger to 8 batches like python-tagged-unittests (Ariouz)
- `ed53829` Build: disable -g and enable -Ob while GITHUB_CI env is set (Ariouz)
- `0706bfe` Windows set-export / unpack-artifact scripts (Ariouz)
- `927f2e1` Add env: ${{ matrix.env }} to tier2 and 3 jobs (Ariouz)
- `59639ed` Add build arg -Ob on libpythonvm library build (Ariouz)
- `fce92cc` Add -J-XX:MaxRAMPercentage=90.0 on graalpy native / libpythonvm builds (Ariouz)
- `edabea7` upload report.json artifact (Ariouz)
- `f9dc6f5` test: python 3.12.8 on linux aarch64 (Ariouz)
- `1cb5373` GItHub CI: Run linux amd64+aarch64 retagger weekly jobs and enable au… (Ariouz)
- `d4ef0c7` Update imports (Ariouz)
- `ee03bd3` WIP: Github tagged test exclusion character (Ariouz)
- `2e4bd61` revert aarch64 retag (Ariouz)
- `989cd60` Revert "Applied retagger diff for linux aarch64 tests" (Ariouz)
- `9de50d4` Revert "Applied diff with 'retagg failed' to true" (Ariouz)
- `bb981c3` Revert "Apply diff_report generated by retagger jobs" (Ariouz)
- `f635124` Update imports (Ariouz)
- `2b08d72` Disable PR test runs while in draft (Ariouz)
- `69d1581` Test: Apply new retagger exclusion character for github jobs (Ariouz)
- `8bd5752` Apply linux-amd64 new github exclusion test tag (Ariouz)
- `73c0cd2` wip: ci-unittest-retagger workflow file (Ariouz)
- `a53b747` Fix workflow (Ariouz)
- `a56e6f7` Use ci-matrix-gen to call retagger and run retagger merge script (Ariouz)
- `9988e22` Test MX_REPORT_SUFFIX with os-arch (Ariouz)
- `7b08e03` Use CURRENT_PLATFORM as mx report suffix while running in GITHUB_CI (Ariouz)
- `598935e` Tmp workflow retagger filter (Ariouz)
- `c961ad8` Extract PR test to a single workflow (Ariouz)
- `c1502fb` rm redundant check (Ariouz)
- `10ce536` PR workflow, rename job (Ariouz)
- `749d0e0` Allow retagger fiter for windows (Ariouz)
- `bfda686` Apply Linux aarch64 retags - fix retagger merge (Ariouz)
- `623c2f7` Fix Windows build (Ariouz)
- `2a09b26` unpack-artifact.cmd (Ariouz)
- `b88861e` Run retagger on Linux aarch64 (Ariouz)
- `4e3248d` Run PR tests on synchronize (Ariouz)
- `c0f2511` Fix Windows arch name in retagger merge (Ariouz)
- `bcb63e9` disable tests on draft (Ariouz)
- `3576d43` Clean some debugs (Ariouz)
- `01354dc` Fix GITHUB_CI not set on merge job (Ariouz)
- `bf1ef89` Rm redundant check (Ariouz)
- `82afb67` Set build allocated memory relative system's available (Ariouz)
- `230f7e5` Open PR on weekly retagger (#8) (Ariouz)
- `e607a53` Add retagger label to PR (Ariouz)
- `6bd5d2a` Remove debugs (Ariouz)
- `908afa8` Merge oracle/graalpython master and fixes issues (Ariouz)
- `0aadcf2` Enable Windows in ci-unittest workflow (Ariouz)
- `9d17178` Update PR unittest trigger (Ariouz)
- `0df91d5` Rm debugs, fix github_ci_build_args not used (Ariouz)
- `5e1461d` Apply retags for linux-x86_64
- `8ce1883` Apply retags for win32-AMD64
- `4fbdefc` Update imports (Ariouz)
- `f3f43f1` Revert 'update imports', mx labsjdk version isn't up to date yet. (Ariouz)
- `9ecb978` Fix psutil package removed by mx update imports (Ariouz)
- `be4379c` test_repr_deep broken after mx update ? (Ariouz)
- `72adc6c` Fix choco / maven in PATH (Ariouz)
- `9de5ec0` Remove comments (Ariouz)
- `5a28761` Fix tagged test platform sort order (Ariouz)
- `b07efff` Add retagger workflow input (Ariouz)
- `6381b6e` Revert retagger gate to weekly, allow weekly in matrix (Ariouz)
- `eca2884` Fix .github/scripts/ path in github path (Ariouz)
- `bb9345a` Run weekly jobs only in tier1 otherwise it would run multiple times (Ariouz)
- `b5d9b6f` Build standalone artifacts before unittest/retags to avoid re-buildin… (Ariouz)
- `62b58b4` Handle case when no tag has changed (Ariouz)
- `0c708df` tmp: Add debug to retagger (Ariouz)
- `22ee071` Excluce 'bench' from pr unittest (Ariouz)
- `d591ea3` Revert mx update in ci/graal/common.jsonnet (Ariouz)
- `cd9a093` Update msbuild action / graal versions (Ariouz)
- `4b93051` Fix empty weekly retagger branch (Ariouz)
- `6104bd6` Rebase ci/graal on master (Ariouz)
- `7031498` Update microsoft/setup-msbuild version, fix empty retagger branch, fi… (Ariouz)
- `469adc5` Override linux aarch64 python version (Ariouz)
- `1391889` Override linux aarch64 python version (Ariouz)
- `f311b70` Fix environment variables (#21) (Ariouz)
- `8fd96dd` {platform}-github tags (#22) (Ariouz)
- `95a72e4` Remove unused function (Ariouz)
- `1c7168d` Merge branch 'master' into vcalvez/github_ci (Ariouz)
- `720d69d` Fix path executables not found on windows (Ariouz)
## New file (+324 lines): GitHub CI matrix generator script
```python
#!/usr/bin/env python3
import argparse
import fnmatch
import json
import os
import re
import shlex
import subprocess
import sys

from dataclasses import dataclass
from functools import cached_property, total_ordering
from typing import Any

DEFAULT_ENV = {
    "CI": "true",
    "PYTHONIOENCODING": "utf-8",
    "GITHUB_CI": "true"
}

# If any of these terms are in the job json, they do not run in public
# infrastructure
JOB_EXCLUSION_TERMS = (
    "enterprise",
    "corporate-compliance",

    # Jobs failing in GitHub Actions: buffer overflow, out of memory
    "python-svm-unittest",
    "cpython-gate",

    "darwin",
)

DOWNLOADS_LINKS = {
    "GRADLE_JAVA_HOME": "https://download.oracle.com/java/{major_version}/latest/jdk-{major_version}_{os}-{arch_short}_bin{ext}"
}

# GitHub runners and their OS/arch capabilities
OSS = {
    "macos-latest": ["darwin", "aarch64"],
    "ubuntu-latest": ["linux", "amd64"],
    "ubuntu-24.04-arm": ["linux", "aarch64"],
    "windows-latest": ["windows", "amd64"]
}

# Override unavailable Python versions for some OS/Arch combinations
PYTHON_VERSIONS = {
    "ubuntu-24.04-arm": "3.12.8",
}


@dataclass
class Artifact:
    name: str
    pattern: str


@total_ordering
class Job:
    def __init__(self, job: dict[str, Any]):
        self.job = job

    @cached_property
    def runs_on(self) -> str:
        capabilities = self.job.get("capabilities", [])
        for os, caps in OSS.items():
            if all(required in capabilities for required in caps):
                return os
        return "ubuntu-latest"

    @cached_property
    def name(self) -> str:
        return self.job["name"]

    @cached_property
    def targets(self) -> list[str]:
        return self.job.get("targets", [])

    @cached_property
    def env(self) -> dict[str, str]:
        return self.job.get("environment", {}) | DEFAULT_ENV

    @cached_property
    def mx_version(self) -> str | None:
        for k, v in self.job.get("packages", {}).items():
            if k == "mx":
                return v.strip("=<>~")

    @cached_property
    def python_version(self) -> str | None:
        python_version = None
        for k, v in self.job.get("packages", {}).items():
            if k == "python3":
                python_version = v.strip("=<>~")
        for k, v in self.job.get("downloads", {}).items():
            if k == "PYTHON3_HOME":
                python_version = v.get("version", python_version)
        if "MX_PYTHON" in self.env:
            del self.env["MX_PYTHON"]
        if "MX_PYTHON_VERSION" in self.env:
            del self.env["MX_PYTHON_VERSION"]

        if self.runs_on in PYTHON_VERSIONS:
            python_version = PYTHON_VERSIONS[self.runs_on]
        return python_version

    @cached_property
    def system_packages(self) -> list[str]:
        # TODO: support more packages
        system_packages = []
        for k, _ in self.job.get("packages", {}).items():
            if k in ["mx", "python3"]:
                continue
            if k.startswith("pip:"):
                continue
            elif k.startswith("00:") or k.startswith("01:"):
                k = k[3:]
            system_packages.append(f"'{k}'" if self.runs_on != "windows-latest" else f"{k}")
        return system_packages

    @cached_property
    def python_packages(self) -> list[str]:
        python_packages = []
        for k, v in self.job.get("packages", {}).items():
            if k.startswith("pip:"):
                python_packages.append(f"'{k[4:]}{v}'" if self.runs_on != "windows-latest" else f"{k[4:]}{v}")
        return python_packages

    def get_download_steps(self, key: str, version: str) -> str:
        download_link = self.get_download_link(key, version)
        filename = download_link.split('/')[-1]

        if self.runs_on == "windows-latest":
            return (f"""
Invoke-WebRequest -Uri {download_link} -OutFile {filename}
$dirname = (& tar -tzf {filename} | Select-Object -First 1).Split('/')[0]
tar -xzf {filename}
Add-Content $env:GITHUB_ENV "{key}=$(Resolve-Path $dirname)"
""")

        return (f"wget -q {download_link} && "
                f"dirname=$(tar -tzf {filename} | head -1 | cut -f1 -d '/') && "
                f"tar -xzf {filename} && "
                f'echo {key}=$(realpath "$dirname") >> $GITHUB_ENV')

    def get_download_link(self, key: str, version: str) -> str:
        os, arch = OSS[self.runs_on]
        major_version = version.split(".")[0]
        extension = ".tar.gz" if not os == "windows" else ".zip"
        os = os if os != "darwin" else "macos"
        arch_short = {"amd64": "x64", "aarch64": "aarch64"}[arch]

        vars = {
            "major_version": major_version,
            "os": os,
            "arch": arch,
            "arch_short": arch_short,
            "ext": extension,
        }

        return DOWNLOADS_LINKS[key].format(**vars)

    @cached_property
    def downloads(self) -> list[str]:
        downloads = []
        for k, download_info in self.job.get("downloads", {}).items():
            if k in DOWNLOADS_LINKS and download_info["version"]:
                downloads.append(self.get_download_steps(k, download_info["version"]))
        return downloads

    @staticmethod
    def common_glob(strings: list[str]) -> str:
        assert strings
        if len(strings) == 1:
            return strings[0]
        prefix = strings[0]
        for s in strings[1:]:
            i = 0
            while i < len(prefix) and i < len(s) and prefix[i] == s[i]:
                i += 1
            prefix = prefix[:i]
            if not prefix:
                break
        suffix = strings[0][len(prefix):]
        for s in strings[1:]:
            i = 1
            while i <= len(suffix) and i <= len(s) and suffix[-i] == s[-i]:
                i += 1
            if i == 1:
                suffix = ""
                break
            suffix = suffix[-(i - 1):]
        return f"{prefix}*{suffix}"

    @cached_property
    def upload_artifact(self) -> Artifact | None:
        if artifacts := self.job.get("publishArtifacts", []):
            assert len(artifacts) == 1
            dir = artifacts[0].get("dir", ".")
            patterns = artifacts[0].get("patterns", ["*"])
            return Artifact(
                artifacts[0]["name"],
                " ".join([os.path.normpath(os.path.join(dir, p)) for p in patterns])
            )
        return None

    @cached_property
    def download_artifact(self) -> Artifact | None:
        if artifacts := self.job.get("requireArtifacts", []):
            pattern = self.common_glob([a["name"] for a in artifacts])
            return Artifact(pattern, os.path.normpath(artifacts[0].get("dir", ".")))
        return None

    @staticmethod
    def flatten_command(args: list[str | list[str]]) -> list[str]:
        flattened_args = []
        for s in args:
            if isinstance(s, list):
                flattened_args.append(f"$( {shlex.join(s)} )")
            else:
                flattened_args.append(s)
        return flattened_args

    @cached_property
    def setup(self) -> str:
        cmds = [self.flatten_command(step) for step in self.job.get("setup", [])]
        return "\n".join(shlex.join(s) for s in cmds)

    @cached_property
    def run(self) -> str:
        cmds = [self.flatten_command(step) for step in self.job.get("run", [])]
        return "\n".join(shlex.join(s) for s in cmds)

    @cached_property
    def logs(self) -> str:
        return "\n".join(os.path.normpath(p) for p in self.job.get("logs", []))

    def to_dict(self):
        """
        This is the interchange with the YAML file defining the Github jobs, so here
        is where we must match the strings and expectations of the Github workflow.
        """
        return {
            "name": self.name,
            "mx_version": self.mx_version,
            "os": self.runs_on,
            "python_version": self.python_version,
            "setup_steps": self.setup,
            "run_steps": self.run,
            "python_packages": " ".join(self.python_packages),
            "system_packages": " ".join(self.system_packages),
            "provide_artifact": [self.upload_artifact.name, self.upload_artifact.pattern] if self.upload_artifact else None,
            "require_artifact": [self.download_artifact.name, self.download_artifact.pattern] if self.download_artifact else None,
            "logs": self.logs.replace("../", "${{ env.PARENT_DIRECTORY }}/"),
            "env": self.env,
            "downloads_steps": " ".join(self.downloads),
        }

    def __str__(self):
        return str(self.to_dict())

    def __eq__(self, other):
        if isinstance(other, Job):
            return self.to_dict() == other.to_dict()
        return NotImplemented

    def __gt__(self, other):
        if isinstance(other, Job):
            if self.job.get("runAfter") == other.name:
                return True
            if self.download_artifact and not other.download_artifact:
                return True
            if self.download_artifact and other.upload_artifact:
                if fnmatch.fnmatch(other.upload_artifact.name, self.download_artifact.name):
                    return True
            if not self.upload_artifact:
                return True
            return False
        return NotImplemented


def get_tagged_jobs(buildspec, target, filter=None):
    jobs = [Job({"name": target}).to_dict()]
    for job in sorted([Job(build) for build in buildspec.get("builds", [])]):
        if not any(t for t in job.targets if t in [target]):
            if "weekly" in job.targets and target == "tier1":
                pass
            else:
                continue
        if filter and not re.match(filter, job.name):
            continue
        if [x for x in JOB_EXCLUSION_TERMS if x in str(job)]:
            continue
        jobs.append(job.to_dict())
    return jobs


def main(jsonnet_bin, ci_jsonnet, target, filter=None, indent=False):
    result = subprocess.check_output([jsonnet_bin, ci_jsonnet], text=True)
    buildspec = json.loads(result)
    tagged_jobs = get_tagged_jobs(buildspec, target, filter=filter)
    matrix = tagged_jobs
    print(json.dumps(matrix, indent=2 if indent else None))


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Generate GitHub CI matrix from Jsonnet buildspec.")
    parser.add_argument("jsonnet_bin", help="Path to jsonnet binary")
    parser.add_argument("ci_jsonnet", help="Path to ci.jsonnet spec")
    parser.add_argument("target", help="Target name (e.g., tier1)")
    parser.add_argument("filter", nargs="?", default=None, help="Regex filter for job names (optional)")
    parser.add_argument('--indent', action='store_true', help='Indent output JSON')
    args = parser.parse_args()
    main(
        jsonnet_bin=args.jsonnet_bin,
        ci_jsonnet=args.ci_jsonnet,
        target=args.target,
        filter=args.filter,
        indent=args.indent or sys.stdout.isatty(),
    )
```
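The `common_glob` helper above collapses a set of artifact names into a single glob made of their longest common prefix and suffix. A standalone sketch of that logic, with illustrative artifact names that are not taken from the PR:

```python
def common_glob(strings: list[str]) -> str:
    """Collapse several names into '<common prefix>*<common suffix>'."""
    assert strings
    if len(strings) == 1:
        return strings[0]
    # Longest common prefix across all names.
    prefix = strings[0]
    for s in strings[1:]:
        i = 0
        while i < len(prefix) and i < len(s) and prefix[i] == s[i]:
            i += 1
        prefix = prefix[:i]
        if not prefix:
            break
    # Longest common suffix of the remainder after the prefix.
    suffix = strings[0][len(prefix):]
    for s in strings[1:]:
        i = 1
        while i <= len(suffix) and i <= len(s) and suffix[-i] == s[-i]:
            i += 1
        if i == 1:
            suffix = ""
            break
        suffix = suffix[-(i - 1):]
    return f"{prefix}*{suffix}"


# Hypothetical artifact names, for illustration only:
print(common_glob(["graalpy-linux-amd64", "graalpy-linux-aarch64"]))
# → graalpy-linux-a*64
```

This single pattern is what `download_artifact` hands to `actions/download-artifact`-style matching so one job can pull several related artifacts at once.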
## New file (+72 lines): retagger report merge script
```python
# ================================
#
# This script is used by CI to merge several retagger report JSON files, which are then used
# by running python3 runner.py merge-tags-from-reports reports-merged.json
#
# ================================

import os
import sys
import json
import glob
import argparse
from dataclasses import dataclass

# statuses we want to focus on
EXPORT_STATUS = ["FAILED"]


@dataclass
class Test:
    name: str
    status: str
    duration: str


def read_report(path: str) -> list[Test]:
    tests = []
    with open(path) as f:
        data = json.load(f)
        for result in data:
            name, status, duration = result.values()
            if status in EXPORT_STATUS:
                tests.append(Test(f"{name}", status, duration))
    return tests


def merge_tests(report: list[Test], merged: dict[str, dict]):
    for test in report:
        if test.name not in merged:
            merged[test.name] = test.__dict__


def export_reports(merged: dict[str, dict], outfile: str):
    with open(outfile, "w") as f:
        json.dump(list(merged.values()), f)
        print(f"=== Exported {len(merged)} ({EXPORT_STATUS}) tests to {f.name} ===")


def merge_reports(reports: list[str], outfile: str):
    merged_reports = {}
    for report in reports:
        report_tests = read_report(report)
        merge_tests(report_tests, merged_reports)

    export_reports(merged_reports, outfile)


def main(outfile: str, source_dir: str, pattern: str):
    path = f"{source_dir}/{pattern}"
    files = glob.glob(path)

    files = [file for file in files if file.endswith(".json")]
    merge_reports(files, outfile)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Merge unittest retagger report JSON files")
    parser.add_argument("--outfile", help="Output file name (optional)", default="reports-merged.json")
    parser.add_argument("--dir", help="Reports files directory (optional)", default=".")
    parser.add_argument("--pattern", default="*", help="Pattern matching for input files (optional)")

    args = parser.parse_args()
    main(
        outfile=args.outfile,
        source_dir=args.dir,
        pattern=args.pattern
    )
```

> **Member review comment** on `name, status, duration = result.values()`: can it use explicit dictionary keys?
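The review comment asking for explicit dictionary keys targets the positional `result.values()` unpacking, which silently depends on the field order of each JSON object. A minimal sketch of the suggested change, assuming the report entries carry `name`, `status`, and `duration` keys (which the positional unpacking already implies):

```python
import json
from dataclasses import dataclass

EXPORT_STATUS = ["FAILED"]


@dataclass
class Test:
    name: str
    status: str
    duration: str


def read_report(path: str) -> list[Test]:
    # Look up each field by key so the code no longer depends on the
    # order in which the JSON objects happen to list their fields.
    tests = []
    with open(path) as f:
        data = json.load(f)
    for result in data:
        if result["status"] in EXPORT_STATUS:
            tests.append(Test(result["name"], result["status"], result["duration"]))
    return tests
```

This also fails loudly with a `KeyError` on a malformed entry instead of mis-assigning fields.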
> **Review comment**: this looks suspicious? It uses `tar`, but `get_download_link` downloads a `zip` for Windows. Also, should it concatenate the commands with some Windows shell version of `&&`?