
뜨개발자 · March 20, 2024

8.9.0 Offline repository

Background

  • This work started from a requirement to customize the Kibana login page.

References

Procedure

  • Download Node.js 16.20.1
$ pwd
/home/rohuser/nodejs/downloads

$ curl -L -O https://nodejs.org/dist/v16.20.1/node-v16.20.1-linux-x64.tar.xz

$ tar -xvf ./node-v16.20.1-linux-x64.tar.xz

$ mv ./node-v16.20.1-linux-x64 ../16.20.1
  • Download Yarn 1.22.19
$ pwd
/home/rohuser/yarn/downloads

$ curl -L -O https://yarnpkg.com/latest.tar.gz

$ tar -xzvf ./latest.tar.gz

$ mv yarn-v1.22.19 ../1.22.19
  • Download Bazel 5.1.1
$ pwd
/home/rohuser/bazel

$ ls
5.1.1  downloads

$ mkdir 5.1.1/bin

$ cd ./downloads

$ curl -L -O https://releases.bazel.build/5.1.1/release/bazel-5.1.1-linux-x86_64

$ mv ./bazel-5.1.1-linux-x86_64 ../5.1.1/bin/bazel

$ cd ../5.1.1/bin/

$ chmod +x ./bazel
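  • Optional sanity check: confirm each downloaded binary runs before wiring it into the profile. The export below affects the current shell only.
$ export PATH=$HOME/nodejs/16.20.1/bin:$HOME/yarn/1.22.19/bin:$HOME/bazel/5.1.1/bin:$PATH

$ node --version
v16.20.1

$ yarn --version
1.22.19

$ bazel --version
bazel 5.1.1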
  • Upgrade gcc
    • The highest gcc version packaged for CentOS 7 is 4.8.5.
    • The Kibana bootstrap and build steps require a newer compiler, so gcc 9.2.0 is built from source here.
$ pwd
/home/rohuser/gcc/downloads

$ GCC_VERSION=9.2.0

$ curl -L -O https://ftp.gnu.org/gnu/gcc/gcc-${GCC_VERSION}/gcc-${GCC_VERSION}.tar.gz

$ tar xzvf gcc-${GCC_VERSION}.tar.gz

$ mkdir obj.gcc-${GCC_VERSION}

$ cd gcc-${GCC_VERSION}

$ ./contrib/download_prerequisites
$ cd ../obj.gcc-${GCC_VERSION}

$ ../gcc-${GCC_VERSION}/configure --disable-multilib --enable-languages=c,c++

$ make -j $(nproc)

$ make install

$ gcc --version
gcc (GCC) 9.2.0
Copyright (C) 2019 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
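  • A couple of post-install checks worth doing (a sketch; assumes make install used the default /usr/local prefix, so the new compiler shadows the system one):
$ hash -r && which gcc
/usr/local/bin/gcc

# The freshly built libstdc++ lives under /usr/local/lib64; exporting it may be
# needed so binaries compiled with gcc 9 can find the newer runtime.
$ export LD_LIBRARY_PATH=/usr/local/lib64:$LD_LIBRARY_PATH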
  • Check that libxcb.so.1 exists
    • The chromedriver binary used during the Kibana build errors out if libxcb.so.1 is missing.
    • If the file is absent, install it with yum.
# Example error
$ ./chromedriver --version
./chromedriver: error while loading shared libraries: libxcb.so.1: cannot open shared object file: No such file or directory

# Locate the file
$ find / -name "*libxcb.so.1*"

# If missing, install the package
$ sudo yum install libxcb
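  • A broader check: ldd lists every shared library chromedriver cannot resolve in one pass, and yum provides maps a missing library back to its owning package.
$ ldd ./chromedriver | grep "not found"

$ sudo yum provides "*/libxcb.so.1"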
  • Add Node.js, Yarn, and Bazel to PATH
$ cd ~

$ cat .bash_profile
# Only the lines added to .bash_profile are shown
NODE_HOME=$HOME/nodejs/16.20.1
YARN_HOME=$HOME/yarn/1.22.19
BAZEL_HOME=$HOME/bazel/5.1.1
PATH=$BAZEL_HOME/bin:/usr/local/bin:$YARN_HOME/bin:$NODE_HOME/bin:$JAVA_HOME/bin:$PATH:$HOME/.local/bin:$HOME/bin
export PATH
  • Register the /usr/bin/python3.6 path as the PYTHON environment variable
$ cd ~

$ cat .bash_profile
# Only the lines added to .bash_profile are shown
PYTHON=/usr/bin/python3.6
export PYTHON
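  • Optional: the new values can be checked in the current shell before reconnecting (only a sanity check; the clean reload happens on the re-login in the next step).
$ source ~/.bash_profile

$ command -v node yarn bazel

$ "$PYTHON" --version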
  • End the session and reconnect over SSH so the profile changes take effect
  • Fork the Kibana GitHub repository
    • When forking, include all branches, not just main.
    • Example result
  • Clone the repository
$ pwd
/data/rsh/repository/elastic

$ git clone -b v8.9.0 https://github.com/RohSeungHyeon/kibana.git ./kibana_offline_repository
  • Configure the Yarn offline mirror
$ pwd
/data/rsh/repository/elastic/kibana_offline_repository

$ cat .yarnrc
# Only the added lines are shown
yarn-offline-mirror ".yarn-local-mirror"
yarn-offline-mirror-pruning true
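  • For reference, this is how the mirror is consumed later on an air-gapped host (a sketch; assumes the .yarn-local-mirror directory and yarn.lock are carried over together and the same .yarnrc entries are in place). The --offline flag makes Yarn fail rather than touch the network.
$ yarn install --offline --frozen-lockfile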
  • Update the shared Bazel configuration
$ pwd
/data/rsh/repository/elastic/kibana_offline_repository

$ cat .bazelrc.common
# Only the changed lines are shown
build --disk_cache=.bazel-offline/disk-cache
fetch --disk_cache=.bazel-offline/disk-cache
query --disk_cache=.bazel-offline/disk-cache
sync --disk_cache=.bazel-offline/disk-cache
test --disk_cache=.bazel-offline/disk-cache
build --repository_cache=.bazel-offline/repository-cache
fetch --repository_cache=.bazel-offline/repository-cache
query --repository_cache=.bazel-offline/repository-cache
run --repository_cache=.bazel-offline/repository-cache
sync --repository_cache=.bazel-offline/repository-cache
test --repository_cache=.bazel-offline/repository-cache
  • Add a user-level Bazel configuration
$ pwd
/data/rsh/repository/elastic/kibana_offline_repository

$ touch .bazelrc.user

$ cat ./.bazelrc.user 
build --distdir=.bazel-offline
  • Run the Kibana bootstrap to install dependencies
$ pwd
/data/rsh/repository/elastic/kibana_offline_repository

$ yarn kbn bootstrap
  • Verify the directories created by the new configuration
$ pwd
/data/rsh/repository/elastic/kibana_offline_repository

$ ls -lath | grep -E "mirror|offline"
drwxrwxr-x    2 rohuser rohuser  52K Oct 13 02:17 .yarn-local-mirror
drwxrwxr-x    4 rohuser rohuser   48 Oct 11 15:11 .bazel-offline
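  • The mirror can also be inspected directly; every dependency Yarn fetched should now exist as a tarball inside it (counts and sizes will vary by Kibana version).
$ find .yarn-local-mirror -name "*.tgz" | wc -l

$ du -sh .yarn-local-mirror .bazel-offline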
  • Change the source so that only the Linux x64 target is built
    • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/lib/platform.ts
    • The changed lines are marked with comments.
Before
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License
 * 2.0 and the Server Side Public License, v 1; you may not use this file except
 * in compliance with, at your election, the Elastic License 2.0 or the Server
 * Side Public License, v 1.
 */

export type PlatformName = 'win32' | 'darwin' | 'linux';
export type PlatformArchitecture = 'x64' | 'arm64';

export class Platform {
  constructor(
    private name: PlatformName,
    private architecture: PlatformArchitecture,
    private buildName: string
  ) {}

  getName() {
    return this.name;
  }

  getArchitecture() {
    return this.architecture;
  }

  getBuildName() {
    return this.buildName;
  }

  getNodeArch() {
    return `${this.name}-${this.architecture}`;
  }

  isWindows() {
    return this.name === 'win32';
  }

  isMac() {
    return this.name === 'darwin';
  }

  isLinux() {
    return this.name === 'linux';
  }
}

export const ALL_PLATFORMS = [
  new Platform('linux', 'x64', 'linux-x86_64'),
  new Platform('linux', 'arm64', 'linux-aarch64'),
  new Platform('darwin', 'x64', 'darwin-x86_64'),
  new Platform('darwin', 'arm64', 'darwin-aarch64'),
  new Platform('win32', 'x64', 'windows-x86_64'),
];

After

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License
 * 2.0 and the Server Side Public License, v 1; you may not use this file except
 * in compliance with, at your election, the Elastic License 2.0 or the Server
 * Side Public License, v 1.
 */

export type PlatformName = 'win32' | 'darwin' | 'linux';
export type PlatformArchitecture = 'x64' | 'arm64';

export class Platform {
  constructor(
    private name: PlatformName,
    private architecture: PlatformArchitecture,
    private buildName: string
  ) {}

  getName() {
    return this.name;
  }

  getArchitecture() {
    return this.architecture;
  }

  getBuildName() {
    return this.buildName;
  }

  getNodeArch() {
    return `${this.name}-${this.architecture}`;
  }

  isWindows() {
    return this.name === 'win32';
  }

  isMac() {
    return this.name === 'darwin';
  }

  isLinux() {
    return this.name === 'linux';
  }
}

/** Changed section */
export const ALL_PLATFORMS = [
  new Platform('linux', 'x64', 'linux-x86_64'),
//  new Platform('linux', 'arm64', 'linux-aarch64'),
//  new Platform('darwin', 'x64', 'darwin-x86_64'),
//  new Platform('darwin', 'arm64', 'darwin-aarch64'),
//  new Platform('win32', 'x64', 'windows-x86_64'),
];
  • Modify the source that downloads the Elasticsearch GPG key
    • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/tasks/fleet/download_elastic_gpg_key.ts
    • The code works by fetching the GPG key from an external endpoint.
      • The GPG_KEY_SHA512 value hard-coded in the source differs from the actual checksum, so set GPG_KEY_SHA512 to the actual value.
    • The key is downloaded to a file, used, and then deleted.
      • Add a step that keeps a separate copy of the GPG key.
      • The actual checksum shows up on standard output after running yarn kbn bootstrap in the repository directory; it can also be computed directly, as shown in the snippet after this list.
    • The added and changed lines are marked with comments.
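    • For reference, the checksum can be computed without waiting for a failed download, using the same URL the code assembles from ARTIFACTS_URL + GPG_KEY_NAME (assumes outbound network access from the build host):
$ curl -sL https://artifacts.elastic.co/GPG-KEY-elasticsearch | sha512sum
62a567354286deb02baf5fc6b82ddf6c7067898723463da9ae65b132b8c6d6f064b2874e390885682376228eed166c1c82fe7f11f6c9a69f0c157029c548fa3d  -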
Before
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License
 * 2.0 and the Server Side Public License, v 1; you may not use this file except
 * in compliance with, at your election, the Elastic License 2.0 or the Server
 * Side Public License, v 1.
 */

import Path from 'path';

import { ToolingLog } from '@kbn/tooling-log';

import { downloadToDisk } from '../../lib';

const ARTIFACTS_URL = 'https://artifacts.elastic.co/';
const GPG_KEY_NAME = 'GPG-KEY-elasticsearch';
const GPG_KEY_SHA512 =
  '84ee193cc337344d9a7da9021daf3f5ede83f5f1ab049d169f3634921529dcd096abf7a91eec7f26f3a6913e5e38f88f69a5e2ce79ad155d46edc75705a648c6';

export async function downloadElasticGpgKey(pkgDir: string, log: ToolingLog) {
  const gpgKeyUrl = ARTIFACTS_URL + GPG_KEY_NAME;
  const destination = Path.resolve(pkgDir, 'target/keys', GPG_KEY_NAME);
  log.info(`Downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}`);

  try {
    await downloadToDisk({
      log,
      url: gpgKeyUrl,
      destination,
      shaChecksum: GPG_KEY_SHA512,
      shaAlgorithm: 'sha512',
      skipChecksumCheck: false,
      maxAttempts: 3,
    });
  } catch (error) {
    throw new Error(
      `Error downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}: ${error.message}`
    );
  }
}

After

/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License
 * 2.0 and the Server Side Public License, v 1; you may not use this file except
 * in compliance with, at your election, the Elastic License 2.0 or the Server
 * Side Public License, v 1.
 */

import Path from 'path';

import { ToolingLog } from '@kbn/tooling-log';

import { downloadToDisk } from '../../lib';

const ARTIFACTS_URL = 'https://artifacts.elastic.co/';
const GPG_KEY_NAME = 'GPG-KEY-elasticsearch';
//const GPG_KEY_SHA512 =
//  '84ee193cc337344d9a7da9021daf3f5ede83f5f1ab049d169f3634921529dcd096abf7a91eec7f26f3a6913e5e38f88f69a5e2ce79ad155d46edc75705a648c6';
/** Set GPG_KEY_SHA512 to the actual value (added) */
const GPG_KEY_SHA512 =
  '62a567354286deb02baf5fc6b82ddf6c7067898723463da9ae65b132b8c6d6f064b2874e390885682376228eed166c1c82fe7f11f6c9a69f0c157029c548fa3d';

export async function downloadElasticGpgKey(pkgDir: string, log: ToolingLog) {
  const gpgKeyUrl = ARTIFACTS_URL + GPG_KEY_NAME;
  const destination = Path.resolve(pkgDir, 'target/keys', GPG_KEY_NAME);
  const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/keys', GPG_KEY_NAME); // added
  log.info(`Downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}, ${offlineDestination}`); // modified

  try {
      /** Download from artifacts. */
   await downloadToDisk({
     log,
     url: gpgKeyUrl,
     destination,
     shaChecksum: GPG_KEY_SHA512,
     shaAlgorithm: 'sha512',
     skipChecksumCheck: false,
     maxAttempts: 3,
   });
    /** Download for offline mirror. (added) */
   await downloadToDisk({
     log,
     url: gpgKeyUrl,
     destination: offlineDestination,
     shaChecksum: GPG_KEY_SHA512,
     shaAlgorithm: 'sha512',
     skipChecksumCheck: false,
     maxAttempts: 3,
   });
  } catch (error) {
    throw new Error(
      `Error downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}: ${error.message}`
    );
  }
}
  • Modify the source that downloads fleet plugin packages from the external registry during the Kibana build
    • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/tasks/fleet/bundle_packages.ts
    • Add a step that keeps a separate copy of the fleet packages.
    • The added lines are marked with comments.
      Before
      /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import Fsp from 'fs/promises';
      import Path from 'path';
      
      import JSON5 from 'json5';
      import { safeLoad, safeDump } from 'js-yaml';
      import { asyncForEach } from '@kbn/std';
      import { ToolingLog } from '@kbn/tooling-log';
      
      import { read, downloadToDisk, unzipBuffer, createZipFile, Config } from '../../lib';
      
      // Package storage v2 url
      export const PACKAGE_STORAGE_REGISTRY_URL = 'https://epr.elastic.co';
      
      interface FleetPackage {
        name: string;
        version: string;
        forceAlignStackVersion?: boolean;
        allowSyncToPrerelease?: boolean;
      }
      
      export async function bundleFleetPackages(pkgDir: string, log: ToolingLog, config: Config) {
        log.info('Fetching fleet packages from package registry');
      
        const configFilePath = config.resolveFromRepo('fleet_packages.json');
        const fleetPackages = (await read(configFilePath)) || '[]';
      
        const parsedFleetPackages: FleetPackage[] = JSON5.parse(fleetPackages);
      
        log.debug(
          `Found configured bundled packages: ${parsedFleetPackages
            .map((fleetPackage) => `${fleetPackage.name}-${fleetPackage.version || 'latest'}`)
            .join(', ')}`
        );
      
        await asyncForEach(parsedFleetPackages, async (fleetPackage) => {
          const stackVersion = config.getBuildVersion();
      
          let versionToWrite = fleetPackage.version;
      
          // If `forceAlignStackVersion` is set, we will rewrite the version specified in the config
          // to the version of the stack when writing the bundled package to disk. This allows us
          // to support some unique package development workflows, e.g. APM.
          if (fleetPackage.forceAlignStackVersion) {
            versionToWrite = stackVersion;
      
            log.debug(
              `Bundling ${fleetPackage.name}-${fleetPackage.version} as ${fleetPackage.name}-${stackVersion} to align with stack version`
            );
          }
      
          const archivePath = `${fleetPackage.name}-${versionToWrite}.zip`;
          const archiveUrl = `${PACKAGE_STORAGE_REGISTRY_URL}/epr/${fleetPackage.name}/${fleetPackage.name}-${fleetPackage.version}.zip`;
      
          const destination = Path.resolve(pkgDir, 'target/bundled_packages', archivePath);
          try {
            await downloadToDisk({
              log,
              url: archiveUrl,
              destination,
              shaChecksum: '',
              shaAlgorithm: 'sha512',
              skipChecksumCheck: true,
              maxAttempts: 3,
            });
      
            // If we're force aligning the version, we need to
            // 1. Unzip the downloaded archive
            // 2. Edit the `manifest.yml` file to include the updated `version` value
            // 3. Re-zip the archive and replace it on disk
            if (fleetPackage.forceAlignStackVersion) {
              const buffer = await Fsp.readFile(destination);
              const zipEntries = await unzipBuffer(buffer);
      
              const manifestPath = `${fleetPackage.name}-${fleetPackage.version}/manifest.yml`;
              const manifestEntry = zipEntries.find((entry) => entry.path === manifestPath);
      
              if (!manifestEntry || !manifestEntry.buffer) {
                log.debug(`Unable to find manifest.yml for stack aligned package ${fleetPackage.name}`);
                return;
              }
      
              const manifestYml = await safeLoad(manifestEntry.buffer.toString('utf8'));
              manifestYml.version = stackVersion;
      
              const newManifestYml = safeDump(manifestYml);
              manifestEntry.buffer = Buffer.from(newManifestYml, 'utf8');
      
              // Update all paths to use the new version
              zipEntries.forEach(
                (entry) => (entry.path = entry.path.replace(fleetPackage.version, versionToWrite!))
              );
      
              await createZipFile(zipEntries, destination);
            }
          } catch (error) {
            throw new Error(
              `Failed to download bundled package archive ${archivePath}: ${error.message}`
            );
          }
        });
      }

      After

       /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import Fsp from 'fs/promises';
      import Path from 'path';
      
      import JSON5 from 'json5';
      import { safeLoad, safeDump } from 'js-yaml';
      import { asyncForEach } from '@kbn/std';
      import { ToolingLog } from '@kbn/tooling-log';
      
      import { read, downloadToDisk, unzipBuffer, createZipFile, Config } from '../../lib';
      
      // Package storage v2 url
      export const PACKAGE_STORAGE_REGISTRY_URL = 'https://epr.elastic.co';
      
      interface FleetPackage {
        name: string;
        version: string;
        forceAlignStackVersion?: boolean;
        allowSyncToPrerelease?: boolean;
      }
      
      export async function bundleFleetPackages(pkgDir: string, log: ToolingLog, config: Config) {
        log.info('Fetching fleet packages from package registry');
      
        const configFilePath = config.resolveFromRepo('fleet_packages.json');
        const fleetPackages = (await read(configFilePath)) || '[]';
      
        const parsedFleetPackages: FleetPackage[] = JSON5.parse(fleetPackages);
      
        log.debug(
          `Found configured bundled packages: ${parsedFleetPackages
            .map((fleetPackage) => `${fleetPackage.name}-${fleetPackage.version || 'latest'}`)
            .join(', ')}`
        );
      
        await asyncForEach(parsedFleetPackages, async (fleetPackage) => {
          const stackVersion = config.getBuildVersion();
      
          let versionToWrite = fleetPackage.version;
      
          // If `forceAlignStackVersion` is set, we will rewrite the version specified in the config
          // to the version of the stack when writing the bundled package to disk. This allows us
          // to support some unique package development workflows, e.g. APM.
          if (fleetPackage.forceAlignStackVersion) {
            versionToWrite = stackVersion;
      
            log.debug(
              `Bundling ${fleetPackage.name}-${fleetPackage.version} as ${fleetPackage.name}-${stackVersion} to align with stack version`
            );
          }
      
          const archivePath = `${fleetPackage.name}-${versionToWrite}.zip`;
          const archiveUrl = `${PACKAGE_STORAGE_REGISTRY_URL}/epr/${fleetPackage.name}/${fleetPackage.name}-${fleetPackage.version}.zip`;
      
          const destination = Path.resolve(pkgDir, 'target/bundled_packages', archivePath);
          const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/bundled_packages', archivePath); // added
          try {
            /** Download from registry. */
           await downloadToDisk({
             log,
             url: archiveUrl,
             destination,
             shaChecksum: '',
             shaAlgorithm: 'sha512',
             skipChecksumCheck: true,
             maxAttempts: 3,
           });
            /** Download for offline mirror. (added) */
           await downloadToDisk({
             log,
             url: archiveUrl,
             destination: offlineDestination,
             shaChecksum: '',
             shaAlgorithm: 'sha512',
             skipChecksumCheck: true,
             maxAttempts: 3,
           });
      
            // If we're force aligning the version, we need to
            // 1. Unzip the downloaded archive
            // 2. Edit the `manifest.yml` file to include the updated `version` value
            // 3. Re-zip the archive and replace it on disk
            if (fleetPackage.forceAlignStackVersion) {
              const buffer = await Fsp.readFile(destination);
              const zipEntries = await unzipBuffer(buffer);
      
              const manifestPath = `${fleetPackage.name}-${fleetPackage.version}/manifest.yml`;
              const manifestEntry = zipEntries.find((entry) => entry.path === manifestPath);
      
              if (!manifestEntry || !manifestEntry.buffer) {
                log.debug(`Unable to find manifest.yml for stack aligned package ${fleetPackage.name}`);
                return;
              }
      
              const manifestYml = await safeLoad(manifestEntry.buffer.toString('utf8'));
              manifestYml.version = stackVersion;
      
              const newManifestYml = safeDump(manifestYml);
              manifestEntry.buffer = Buffer.from(newManifestYml, 'utf8');
      
              // Update all paths to use the new version
              zipEntries.forEach(
                (entry) => (entry.path = entry.path.replace(fleetPackage.version, versionToWrite!))
              );
      
              await createZipFile(zipEntries, destination);
            }
          } catch (error) {
            throw new Error(
              `Failed to download bundled package archive ${archivePath}: ${error.message}`
            );
          }
        });
      }
      
      • Modify the source that fetches the Elastic Agent version list from an external API
        • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/tasks/fetch_agent_versions_list.ts
        • Change the part that parses the API response into a return value so that it returns a static list instead.
        • The version list can be obtained by calling the API named in the source directly, as shown in the snippet after this list.
        • The changed and added lines are marked with comments.
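        • A sketch of that direct call, applying the same filter the code uses (read jsonBody[0], keep entries whose title contains 'Elastic Agent', map version_number); assumes jq is installed on the build host:
      $ curl -s https://www.elastic.co/api/product_versions \
          | jq -r '.[0][] | select(.title | contains("Elastic Agent")) | .version_number'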
      Before
      /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import fetch from 'node-fetch';
      import pRetry from 'p-retry';
      
      import { ToolingLog } from '@kbn/tooling-log';
      import { write, Task } from '../lib';
      
      // Endpoint maintained by the web-team and hosted on the elastic website
      const PRODUCT_VERSIONS_URL = 'https://www.elastic.co/api/product_versions';
      
      const isPr = () =>
        !!process.env.BUILDKITE_PULL_REQUEST && process.env.BUILDKITE_PULL_REQUEST !== 'false';
      
      const getAvailableVersions = async (log: ToolingLog) => {
        const options = {
          headers: {
            'Content-Type': 'application/json',
          },
        };
        log.info('Fetching Elastic Agent versions list');
      
        try {
          const results = await pRetry(() => fetch(PRODUCT_VERSIONS_URL, options), { retries: 3 });
          const rawBody = await results.text();
      
          if (results.status >= 400) {
            throw new Error(`Status code ${results.status} received from versions API: ${rawBody}`);
          }
      
          const jsonBody = JSON.parse(rawBody);
      
          const versions: string[] = (jsonBody.length ? jsonBody[0] : [])
            .filter((item: any) => item?.title?.includes('Elastic Agent'))
            .map((item: any) => item?.version_number);
      
          log.info(`Retrieved available versions`);
          return versions;
        } catch (error) {
          const errorMessage = 'Failed to fetch Elastic Agent versions list';
      
          if (isPr()) {
            // For PR jobs, just log the error as a warning and continue
            log.warning(errorMessage);
            log.warning(error);
          } else {
            // For non-PR jobs like nightly builds, log the error to stderror and throw
            // to ensure the build fails
            log.error(errorMessage);
            throw new Error(error);
          }
        }
        return [];
      };
      
      // Keep the elastic agent versions list in Fleet UI updated
      export const FetchAgentVersionsList: Task = {
        description: 'Build list of available Elastic Agent versions for Fleet UI',
      
        async run(config, log, build) {
          // Agent version list task is skipped for PR's, so as not to overwhelm the versions API
          if (isPr()) {
            return;
          }
      
          const versionsList = await getAvailableVersions(log);
          const AGENT_VERSION_BUILD_FILE = 'x-pack/plugins/fleet/target/agent_versions_list.json';
      
          if (versionsList !== []) {
            log.info(`Writing versions list to ${AGENT_VERSION_BUILD_FILE}`);
            await write(
              build.resolvePath(AGENT_VERSION_BUILD_FILE),
              JSON.stringify(versionsList, null, '  ')
            );
          }
        },
      };

      After

       /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import fetch from 'node-fetch';
      import pRetry from 'p-retry';
      
      import { ToolingLog } from '@kbn/tooling-log';
      import { write, Task } from '../lib';
      
      // Endpoint maintained by the web-team and hosted on the elastic website
      const PRODUCT_VERSIONS_URL = 'https://www.elastic.co/api/product_versions';
      
      const isPr = () =>
        !!process.env.BUILDKITE_PULL_REQUEST && process.env.BUILDKITE_PULL_REQUEST !== 'false';
      
      const getAvailableVersions = async (log: ToolingLog) => {
        const options = {
          headers: {
            'Content-Type': 'application/json',
          },
        };
        log.info('Fetching Elastic Agent versions list');
      
        try {
          /** API calls commented out (changed) */
          //    const results = await pRetry(() => fetch(PRODUCT_VERSIONS_URL, options), { retries: 3 });
          //    const rawBody = await results.text();
      
          //    if (results.status >= 400) {
          //      throw new Error(`Status code ${results.status} received from versions API: ${rawBody}`);
          //    }
      
          //    const jsonBody = JSON.parse(rawBody);
      
          /** Fetch from API (changed) */
          //    const versions: string[] = (jsonBody.length ? jsonBody[0] : [])
          //      .filter((item: any) => item?.title?.includes('Elastic Agent'))
          //      .map((item: any) => item?.version_number);
          /** Static values (added) */
          log.info(`Return static values of Agent version list.`);
          const versions = [
            '7.17.14',
            '8.10.3',
            '8.10.2',
            '8.10.1',
            '8.10.0',
            '7.17.13',
            '8.9.2',
            '8.9.1',
            '7.17.12',
            '8.9.0',
            '7.17.11',
            '8.8.2',
            '8.8.1',
            '8.8.0',
            '7.17.10',
            '8.7.1',
            '8.7.0',
            '8.6.2',
            '7.17.9',
            '8.6.1',
            '8.6.0',
            '7.17.8',
            '8.5.3',
            '8.5.2',
            '8.5.1',
            '8.5.0',
            '7.17.7',
            '8.4.3',
            '8.4.2',
            '8.4.1',
            '7.17.6',
            '8.4.0',
            '8.3.3',
            '8.3.2',
            '8.3.1',
            '7.17.5',
            '8.3.0',
            '8.2.3',
            '8.0.0-alpha2',
            '8.0.0-alpha1',
            '8.2.2',
            '8.2.1',
            '7.17.4',
            '8.2.0',
            '7.17.3',
            '8.1.3',
            '8.1.1',
            '8.1.2',
            '8.1.0',
            '8.0.0-rc2',
            '8.0.0-rc1',
            '8.0.0-beta1',
            '8.0.1',
            '8.0.0',
            '7.17.2',
            '7.17.1',
            '7.17.0',
            '7.16.3',
            '7.16.2',
            '7.16.1',
            '7.16.0',
            '7.15.1',
            '7.15.2',
            '7.15.0',
            '7.14.2',
            '7.14.1',
            '7.14.0',
            '7.11.0',
            '7.11.1',
            '7.11.2',
            '7.12.0',
            '7.12.1',
            '7.13.0',
            '7.13.1',
            '7.13.2',
            '7.13.3',
            '7.13.4',
            '7.9.0',
            '7.9.1',
            '7.9.2',
            '7.9.3',
            '7.10.0',
            '7.10.1',
            '7.10.2',
            '7.8.0',
            '7.8.1',
          ];
      
          log.info(`Retrieved available versions`);
          return versions;
        } catch (error) {
          const errorMessage = 'Failed to fetch Elastic Agent versions list';
      
          if (isPr()) {
            // For PR jobs, just log the error as a warning and continue
            log.warning(errorMessage);
            log.warning(error);
          } else {
            // For non-PR jobs like nightly builds, log the error to stderror and throw
            // to ensure the build fails
            log.error(errorMessage);
            throw new Error(error);
          }
        }
        return [];
      };
      
      // Keep the elastic agent versions list in Fleet UI updated
      export const FetchAgentVersionsList: Task = {
        description: 'Build list of available Elastic Agent versions for Fleet UI',
      
        async run(config, log, build) {
          // Agent version list task is skipped for PR's, so as not to overwhelm the versions API
          if (isPr()) {
            return;
          }
      
          const versionsList = await getAvailableVersions(log);
          const AGENT_VERSION_BUILD_FILE = 'x-pack/plugins/fleet/target/agent_versions_list.json';
      
          if (versionsList !== []) {
            log.info(`Writing versions list to ${AGENT_VERSION_BUILD_FILE}`);
            await write(
              build.resolvePath(AGENT_VERSION_BUILD_FILE),
              JSON.stringify(versionsList, null, '  ')
            );
          }
        },
      };
      
      • Run the Kibana build
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository
      
      $ yarn build --skip-os-packages --release --skip-docker-ubuntu --skip-docker-cloud --skip-docker-serverless --skip-docker-ubi --skip-docker-contexts --skip-archives
      • Verify the newly created directories
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository
      
      $ ls ./.fleet-offline/*
      ./.fleet-offline/bundled_packages:
      apm-8.9.0-SNAPSHOT.zip  elastic_agent-1.8.0.zip  fleet_server-1.3.1.zip                 profiler_collector-8.9.0.zip            profiler_symbolizer-8.9.0.zip        synthetics-1.0.1.zip
      apm-8.9.0.zip           endpoint-8.9.0.zip       profiler_collector-8.9.0-SNAPSHOT.zip  profiler_symbolizer-8.9.0-SNAPSHOT.zip  security_detection_engine-8.8.4.zip
      
      ./.fleet-offline/keys:
      GPG-KEY-elasticsearch
      
      $ ls -lath | grep "node_binaries"
      drwxrwxr-x    3 rohuser rohuser   21 Oct 11 19:32 .node_binaries
      • Modify the source that downloads the Elasticsearch GPG key (second pass)
        • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/tasks/fleet/download_elastic_gpg_key.ts
        • Change the code to use the GPG key already downloaded to /data/rsh/repository/elastic/kibana_offline_repository/.fleet-offline/keys.
        • The changed and added lines are marked with comments; an integrity check on the saved key is sketched after this list.
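        • Before switching the build over to the saved copy, it is worth confirming the mirrored key still matches the checksum recorded earlier (sha512sum prints the digest followed by the file name):
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository

      $ sha512sum ./.fleet-offline/keys/GPG-KEY-elasticsearch
      62a567354286deb02baf5fc6b82ddf6c7067898723463da9ae65b132b8c6d6f064b2874e390885682376228eed166c1c82fe7f11f6c9a69f0c157029c548fa3d  ./.fleet-offline/keys/GPG-KEY-elasticsearch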
      Before
      /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import Path from 'path';
      
      import { ToolingLog } from '@kbn/tooling-log';
      
      import { downloadToDisk } from '../../lib';
      
      const ARTIFACTS_URL = 'https://artifacts.elastic.co/';
      const GPG_KEY_NAME = 'GPG-KEY-elasticsearch';
      //const GPG_KEY_SHA512 =
      //  '84ee193cc337344d9a7da9021daf3f5ede83f5f1ab049d169f3634921529dcd096abf7a91eec7f26f3a6913e5e38f88f69a5e2ce79ad155d46edc75705a648c6';
      /** Set GPG_KEY_SHA512 to the actual value (added) */
      const GPG_KEY_SHA512 =
        '62a567354286deb02baf5fc6b82ddf6c7067898723463da9ae65b132b8c6d6f064b2874e390885682376228eed166c1c82fe7f11f6c9a69f0c157029c548fa3d';
      
      export async function downloadElasticGpgKey(pkgDir: string, log: ToolingLog) {
        const gpgKeyUrl = ARTIFACTS_URL + GPG_KEY_NAME;
        const destination = Path.resolve(pkgDir, 'target/keys', GPG_KEY_NAME);
        const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/keys', GPG_KEY_NAME); // added
        log.info(`Downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}, ${offlineDestination}`); // modified
      
        try {
            /** Download from artifacts. */
         await downloadToDisk({
           log,
           url: gpgKeyUrl,
           destination,
           shaChecksum: GPG_KEY_SHA512,
           shaAlgorithm: 'sha512',
           skipChecksumCheck: false,
           maxAttempts: 3,
         });
         /** Download for offline mirror. (added) */
         await downloadToDisk({
           log,
           url: gpgKeyUrl,
           destination: offlineDestination,
           shaChecksum: GPG_KEY_SHA512,
           shaAlgorithm: 'sha512',
           skipChecksumCheck: false,
           maxAttempts: 3,
         });
        } catch (error) {
          throw new Error(
            `Error downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}: ${error.message}`
          );
        }
      }
      
      After

       /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import Path from 'path';
      
      import { ToolingLog } from '@kbn/tooling-log';
      
       import { downloadToDisk, scanCopy } from '../../lib'; // added
      
      const ARTIFACTS_URL = 'https://artifacts.elastic.co/';
      const GPG_KEY_NAME = 'GPG-KEY-elasticsearch';
      //const GPG_KEY_SHA512 =
      //  '84ee193cc337344d9a7da9021daf3f5ede83f5f1ab049d169f3634921529dcd096abf7a91eec7f26f3a6913e5e38f88f69a5e2ce79ad155d46edc75705a648c6';
      const GPG_KEY_SHA512 =
        '62a567354286deb02baf5fc6b82ddf6c7067898723463da9ae65b132b8c6d6f064b2874e390885682376228eed166c1c82fe7f11f6c9a69f0c157029c548fa3d';
       const distPerms = (rec: Record) => (rec.type === 'file' ? 0o644 : 0o755); // added
      
      export async function downloadElasticGpgKey(pkgDir: string, log: ToolingLog) {
        const gpgKeyUrl = ARTIFACTS_URL + GPG_KEY_NAME;
      //  const destination = Path.resolve(pkgDir, 'target/keys', GPG_KEY_NAME); // changed
      //  const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/keys', GPG_KEY_NAME); // changed
      //  log.info(`Downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}, ${offlineDestination}`);
        const destination = Path.resolve(pkgDir, 'target/keys'); // added
        const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/keys'); // added
        log.info(`Copying Elastic GPG key from ${offlineDestination} to ${destination}`); // added
      
        try {
          /** Download from artifacts. (changed) */
      //    await downloadToDisk({
      //      log,
      //      url: gpgKeyUrl,
      //      destination,
      //      shaChecksum: GPG_KEY_SHA512,
      //      shaAlgorithm: 'sha512',
      //      skipChecksumCheck: false,
      //      maxAttempts: 3,
      //    });
          /** Download for offline mirror. (changed) */
      //    await downloadToDisk({
      //      log,
      //      url: gpgKeyUrl,
      //      destination: offlineDestination,
      //      shaChecksum: GPG_KEY_SHA512,
      //      shaAlgorithm: 'sha512',
      //      skipChecksumCheck: false,
      //      maxAttempts: 3,
      //    });
          /** Copy from offline mirror. (added) */
          await scanCopy({
            source: offlineDestination,
            destination,
            permissions: distPerms,
          }); 
        } catch (error) {
          throw new Error(
            `Error downloading Elastic GPG key from ${gpgKeyUrl} to ${destination}: ${error.message}`
          );
        }
      }
      
      • Modify the source that downloads fleet plugin packages from the external registry (second pass)
        • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/tasks/fleet/bundle_packages.ts
        • Change the code to use the fleet archives already downloaded to /data/rsh/repository/elastic/kibana_offline_repository/.fleet-offline/bundled_packages.
        • The changed and added lines are marked with comments.
      Before
      /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import Fsp from 'fs/promises';
      import Path from 'path';
      
      import JSON5 from 'json5';
      import { safeLoad, safeDump } from 'js-yaml';
      import { asyncForEach } from '@kbn/std';
      import { ToolingLog } from '@kbn/tooling-log';
      
      import { read, downloadToDisk, unzipBuffer, createZipFile, Config } from '../../lib';
      
      // Package storage v2 url
      export const PACKAGE_STORAGE_REGISTRY_URL = 'https://epr.elastic.co';
      
      interface FleetPackage {
        name: string;
        version: string;
        forceAlignStackVersion?: boolean;
        allowSyncToPrerelease?: boolean;
      }
      
      export async function bundleFleetPackages(pkgDir: string, log: ToolingLog, config: Config) {
        log.info('Fetching fleet packages from package registry');
      
        const configFilePath = config.resolveFromRepo('fleet_packages.json');
        const fleetPackages = (await read(configFilePath)) || '[]';
      
        const parsedFleetPackages: FleetPackage[] = JSON5.parse(fleetPackages);
      
        log.debug(
          `Found configured bundled packages: ${parsedFleetPackages
            .map((fleetPackage) => `${fleetPackage.name}-${fleetPackage.version || 'latest'}`)
            .join(', ')}`
        );
      
        await asyncForEach(parsedFleetPackages, async (fleetPackage) => {
          const stackVersion = config.getBuildVersion();
      
          let versionToWrite = fleetPackage.version;
      
          // If `forceAlignStackVersion` is set, we will rewrite the version specified in the config
          // to the version of the stack when writing the bundled package to disk. This allows us
          // to support some unique package development workflows, e.g. APM.
          if (fleetPackage.forceAlignStackVersion) {
            versionToWrite = stackVersion;
      
            log.debug(
              `Bundling ${fleetPackage.name}-${fleetPackage.version} as ${fleetPackage.name}-${stackVersion} to align with stack version`
            );
          }
      
          const archivePath = `${fleetPackage.name}-${versionToWrite}.zip`;
          const archiveUrl = `${PACKAGE_STORAGE_REGISTRY_URL}/epr/${fleetPackage.name}/${fleetPackage.name}-${fleetPackage.version}.zip`;
      
          const destination = Path.resolve(pkgDir, 'target/bundled_packages', archivePath);
          const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/bundled_packages', archivePath); // added
          try {
            /** Download from registry. */
           await downloadToDisk({
             log,
             url: archiveUrl,
             destination,
             shaChecksum: '',
             shaAlgorithm: 'sha512',
             skipChecksumCheck: true,
             maxAttempts: 3,
           });
            /** Download for offline mirror. (added) */
           await downloadToDisk({
             log,
             url: archiveUrl,
             destination: offlineDestination,
             shaChecksum: '',
             shaAlgorithm: 'sha512',
             skipChecksumCheck: true,
             maxAttempts: 3,
           });
      
            // If we're force aligning the version, we need to
            // 1. Unzip the downloaded archive
            // 2. Edit the `manifest.yml` file to include the updated `version` value
            // 3. Re-zip the archive and replace it on disk
            if (fleetPackage.forceAlignStackVersion) {
              const buffer = await Fsp.readFile(destination);
              const zipEntries = await unzipBuffer(buffer);
      
              const manifestPath = `${fleetPackage.name}-${fleetPackage.version}/manifest.yml`;
              const manifestEntry = zipEntries.find((entry) => entry.path === manifestPath);
      
              if (!manifestEntry || !manifestEntry.buffer) {
                log.debug(`Unable to find manifest.yml for stack aligned package ${fleetPackage.name}`);
                return;
              }
      
              const manifestYml = await safeLoad(manifestEntry.buffer.toString('utf8'));
              manifestYml.version = stackVersion;
      
              const newManifestYml = safeDump(manifestYml);
              manifestEntry.buffer = Buffer.from(newManifestYml, 'utf8');
      
              // Update all paths to use the new version
              zipEntries.forEach(
                (entry) => (entry.path = entry.path.replace(fleetPackage.version, versionToWrite!))
              );
      
              await createZipFile(zipEntries, destination);
            }
          } catch (error) {
            throw new Error(
              `Failed to download bundled package archive ${archivePath}: ${error.message}`
            );
          }
        });
      }
      
      After

       /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import Fsp from 'fs/promises';
      import Path from 'path';
      
      import JSON5 from 'json5';
      import { safeLoad, safeDump } from 'js-yaml';
      import { asyncForEach } from '@kbn/std';
      import { ToolingLog } from '@kbn/tooling-log';
      
       import { read, downloadToDisk, unzipBuffer, createZipFile, Config, scanCopy } from '../../lib'; // added
      
       const distPerms = (rec: Record) => (rec.type === 'file' ? 0o644 : 0o755); // added
      
      // Package storage v2 url
      export const PACKAGE_STORAGE_REGISTRY_URL = 'https://epr.elastic.co';
      
      interface FleetPackage {
        name: string;
        version: string;
        forceAlignStackVersion?: boolean;
        allowSyncToPrerelease?: boolean;
      }
      
      export async function bundleFleetPackages(pkgDir: string, log: ToolingLog, config: Config) {
        log.info('Fetching fleet packages from package registry');
      
        const configFilePath = config.resolveFromRepo('fleet_packages.json');
        const fleetPackages = (await read(configFilePath)) || '[]';
      
        const parsedFleetPackages: FleetPackage[] = JSON5.parse(fleetPackages);
      
        log.debug(
          `Found configured bundled packages: ${parsedFleetPackages
            .map((fleetPackage) => `${fleetPackage.name}-${fleetPackage.version || 'latest'}`)
            .join(', ')}`
        );
      
         /** Copy the contents of the offline storage directory into the working directory (added) */
        const destination = Path.resolve(pkgDir, 'target/bundled_packages');
        const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/bundled_packages');
        log.info(`Copy from ${offlineDestination} to ${destination}.`);
        try {
          await scanCopy({
            source: offlineDestination,
            destination: destination,
            permissions: distPerms,
          });
        } catch (error) {
          throw new Error(
            `Failed to Copy from ${offlineDestination} to ${destination}: ${error.message}`
          );
        }
      
        await asyncForEach(parsedFleetPackages, async (fleetPackage) => {
          const stackVersion = config.getBuildVersion();
      
          let versionToWrite = fleetPackage.version;
      
          // If `forceAlignStackVersion` is set, we will rewrite the version specified in the config
          // to the version of the stack when writing the bundled package to disk. This allows us
          // to support some unique package development workflows, e.g. APM.
          if (fleetPackage.forceAlignStackVersion) {
            versionToWrite = stackVersion;
      
            log.debug(
              `Bundling ${fleetPackage.name}-${fleetPackage.version} as ${fleetPackage.name}-${stackVersion} to align with stack version`
            );
          }
      
          const archivePath = `${fleetPackage.name}-${versionToWrite}.zip`;
          const archiveUrl = `${PACKAGE_STORAGE_REGISTRY_URL}/epr/${fleetPackage.name}/${fleetPackage.name}-${fleetPackage.version}.zip`;
      
          const destination = Path.resolve(pkgDir, 'target/bundled_packages', archivePath);
          const offlineDestination = Path.resolve(pkgDir, '../../../../../.fleet-offline/bundled_packages', archivePath);
          try {
            /** Download from registry. (changed) */
      //      await downloadToDisk({
      //        log,
      //        url: archiveUrl,
      //        destination,
      //        shaChecksum: '',
      //        shaAlgorithm: 'sha512',
      //        skipChecksumCheck: true,
      //        maxAttempts: 3,
      //      });
            /** Download for offline mirror. (changed) */
      //      await downloadToDisk({
      //        log,
      //        url: archiveUrl,
      //        destination: offlineDestination,
      //        shaChecksum: '',
      //        shaAlgorithm: 'sha512',
      //        skipChecksumCheck: true,
      //        maxAttempts: 3,
      //      });
      
            // If we're force aligning the version, we need to
            // 1. Unzip the downloaded archive
            // 2. Edit the `manifest.yml` file to include the updated `version` value
            // 3. Re-zip the archive and replace it on disk
            if (fleetPackage.forceAlignStackVersion) {
              const buffer = await Fsp.readFile(destination);
              const zipEntries = await unzipBuffer(buffer);
      
              const manifestPath = `${fleetPackage.name}-${fleetPackage.version}/manifest.yml`;
              const manifestEntry = zipEntries.find((entry) => entry.path === manifestPath);
      
              if (!manifestEntry || !manifestEntry.buffer) {
                log.debug(`Unable to find manifest.yml for stack aligned package ${fleetPackage.name}`);
                return;
              }
      
              const manifestYml = await safeLoad(manifestEntry.buffer.toString('utf8'));
              manifestYml.version = stackVersion;
      
              const newManifestYml = safeDump(manifestYml);
              manifestEntry.buffer = Buffer.from(newManifestYml, 'utf8');
      
              // Update all paths to use the new version
              zipEntries.forEach(
                (entry) => (entry.path = entry.path.replace(fleetPackage.version, versionToWrite!))
              );
      
              await createZipFile(zipEntries, destination);
            }
          } catch (error) {
            throw new Error(
              `Failed to download bundled package archive ${archivePath}: ${error.message}`
            );
          }
        });
      }
      
      • Modify the source that downloads Node.js native modules from an external proxy
        • Path: /data/rsh/repository/elastic/kibana_offline_repository/src/dev/build/tasks/patch_native_modules_task.ts
        • This code creates a .native_modules directory under the repository and stores the downloaded archives there.
        • It downloads them again on every build, so that logic is disabled; the linux-x64 archive can be staged once by hand, as sketched after this list.
        • The changed lines are marked with comments.
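        • A sketch of staging the archive manually: the path mirrors DOWNLOAD_DIRECTORY/<pkg.name>/<archive basename> from the code, and the URL and sha256 come from the packages table in the source below.
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository

      $ mkdir -p .native_modules/re2

      $ curl -L -o .native_modules/re2/linux-x64-93.gz https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/linux-x64-93.gz

      $ sha256sum .native_modules/re2/linux-x64-93.gz
      4d06747b266c75b6f7ced93977692c0586ce6a52924cabb569bd966378941aa1  .native_modules/re2/linux-x64-93.gz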
      Before
      /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import path from 'path';
      
      import { ToolingLog } from '@kbn/tooling-log';
      
      import {
        deleteAll,
        downloadToDisk,
        gunzip,
        untar,
        Task,
        Config,
        Build,
        Platform,
        read,
      } from '../lib';
      
      const DOWNLOAD_DIRECTORY = '.native_modules';
      
      interface Package {
        name: string;
        version: string;
        destinationPath: string;
        extractMethod: string;
        archives: Record<
          string,
          {
            url: string;
            sha256: string;
          }
        >;
      }
      
      const packages: Package[] = [
        {
          name: 're2',
          version: '1.17.4',
          destinationPath: 'node_modules/re2/build/Release/re2.node',
          extractMethod: 'gunzip',
          archives: {
            'darwin-x64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/darwin-x64-93.gz',
              sha256: '9558c5cb39622e9b3653203e772b129d6c634e7dbd7af1b244352fc1d704601f',
            },
            'linux-x64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/linux-x64-93.gz',
              sha256: '4d06747b266c75b6f7ced93977692c0586ce6a52924cabb569bd966378941aa1',
            },
      
            // ARM builds are currently done manually as Github Actions used in upstream project
            // do not natively support an ARM target.
      
            // From an AWS Graviton instance running Ubuntu or a GCE T2A instance running Debian:
            // * install build-essential package: `sudo apt-get update` + `sudo apt install build-essential`
            // * install nvm and the node version used by the Kibana repository
            // * `npm install re2@1.17.7`
            // * re2 will build itself on install
            // * `cp node_modules/re2/build/Release/re2.node linux-arm64-$(node -e "console.log(process.versions.modules)")`
            // * `gzip linux-arm64-*`
            // * capture the sha256 with: `shasum -a 256 linux-arm64-*`
            // * upload the `linux-arm64-*.gz` artifact to the `yarn-prebuilt-artifacts` bucket in GCS using the correct version number
            'linux-arm64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/linux-arm64-93.gz',
              sha256: '25409584f76f3d6ed85463d84adf094eb6e256ed1cb0b754b95bcbda6691fc26',
            },
      
            // A similar process is necessary for building on ARM macs:
            // * bootstrap and re2 will build itself on install
            // * `cp node_modules/re2/build/Release/re2.node darwin-arm64-$(node -e "console.log(process.versions.modules)")`
            // * `gzip darwin-arm64-*`
            // * capture the sha256 with: `shasum -a 256 darwin-arm64-*`
            // * upload the `darwin-arm64-*.gz` artifact to the `yarn-prebuilt-artifacts` bucket in GCS using the correct version number
            'darwin-arm64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/darwin-arm64-93.gz',
              sha256: 'd4b708749ddef1c87019f6b80e051ed0c29ccd1de34f233c47d8dcaddf803872',
            },
      
            'win32-x64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/win32-x64-93.gz',
              sha256: '0320d0c0385432944c6fb3c8c8fcd78d440ce5626f7618f9ec71d88e44820674',
            },
          },
        },
      ];
      
      async function getInstalledVersion(config: Config, packageName: string) {
        const packageJSONPath = config.resolveFromRepo(
          path.join('node_modules', packageName, 'package.json')
        );
        const json = await read(packageJSONPath);
        const packageJSON = JSON.parse(json);
        return packageJSON.version;
      }
      
      async function patchModule(
        config: Config,
        log: ToolingLog,
        build: Build,
        platform: Platform,
        pkg: Package
      ) {
        const installedVersion = await getInstalledVersion(config, pkg.name);
        if (installedVersion !== pkg.version) {
          throw new Error(
            `Can't patch ${pkg.name}'s native module, we were expecting version ${pkg.version} and found ${installedVersion}`
          );
        }
        const platformName = platform.getNodeArch();
        const archive = pkg.archives[platformName];
        const archiveName = path.basename(archive.url);
        const downloadPath = config.resolveFromRepo(DOWNLOAD_DIRECTORY, pkg.name, archiveName);
        const extractPath = build.resolvePathForPlatform(platform, pkg.destinationPath);
        log.debug(`Patching ${pkg.name} binaries from ${archive.url} to ${extractPath}`);
      
        await deleteAll([extractPath], log);
        await downloadToDisk({
          log,
          url: archive.url,
          destination: downloadPath,
          shaChecksum: archive.sha256,
          shaAlgorithm: 'sha256',
          maxAttempts: 3,
        });
        switch (pkg.extractMethod) {
          case 'gunzip':
            await gunzip(downloadPath, extractPath);
            break;
          case 'untar':
            await untar(downloadPath, extractPath);
            break;
          default:
            throw new Error(`Extract method of ${pkg.extractMethod} is not supported`);
        }
      }
      
      export const PatchNativeModules: Task = {
        description: 'Patching platform-specific native modules',
        async run(config, log, build) {
          for (const pkg of packages) {
            await Promise.all(
              config.getTargetPlatforms().map(async (platform) => {
                await patchModule(config, log, build, platform, pkg);
              })
            );
          }
        },
      };
      
      After

       /*
       * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
       * or more contributor license agreements. Licensed under the Elastic License
       * 2.0 and the Server Side Public License, v 1; you may not use this file except
       * in compliance with, at your election, the Elastic License 2.0 or the Server
       * Side Public License, v 1.
       */
      
      import path from 'path';
      
      import { ToolingLog } from '@kbn/tooling-log';
      
      import {
        deleteAll,
        downloadToDisk,
        gunzip,
        untar,
        Task,
        Config,
        Build,
        Platform,
        read,
      } from '../lib';
      
      const DOWNLOAD_DIRECTORY = '.native_modules';
      
      interface Package {
        name: string;
        version: string;
        destinationPath: string;
        extractMethod: string;
        archives: Record<
          string,
          {
            url: string;
            sha256: string;
          }
        >;
      }
      
      const packages: Package[] = [
        {
          name: 're2',
          version: '1.17.4',
          destinationPath: 'node_modules/re2/build/Release/re2.node',
          extractMethod: 'gunzip',
          archives: {
            'darwin-x64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/darwin-x64-93.gz',
              sha256: '9558c5cb39622e9b3653203e772b129d6c634e7dbd7af1b244352fc1d704601f',
            },
            'linux-x64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/linux-x64-93.gz',
              sha256: '4d06747b266c75b6f7ced93977692c0586ce6a52924cabb569bd966378941aa1',
            },
      
            // ARM builds are currently done manually as Github Actions used in upstream project
            // do not natively support an ARM target.
      
            // From an AWS Graviton instance running Ubuntu or a GCE T2A instance running Debian:
            // * install build-essential package: `sudo apt-get update` + `sudo apt install build-essential`
            // * install nvm and the node version used by the Kibana repository
            // * `npm install re2@1.17.7`
            // * re2 will build itself on install
            // * `cp node_modules/re2/build/Release/re2.node linux-arm64-$(node -e "console.log(process.versions.modules)")`
            // * `gzip linux-arm64-*`
            // * capture the sha256 with: `shasum -a 256 linux-arm64-*`
            // * upload the `linux-arm64-*.gz` artifact to the `yarn-prebuilt-artifacts` bucket in GCS using the correct version number
            'linux-arm64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/linux-arm64-93.gz',
              sha256: '25409584f76f3d6ed85463d84adf094eb6e256ed1cb0b754b95bcbda6691fc26',
            },
      
            // A similar process is necessary for building on ARM macs:
            // * bootstrap and re2 will build itself on install
            // * `cp node_modules/re2/build/Release/re2.node darwin-arm64-$(node -e "console.log(process.versions.modules)")`
            // * `gzip darwin-arm64-*`
            // * capture the sha256 with: `shasum -a 256 darwin-arm64-*`
            // * upload the `darwin-arm64-*.gz` artifact to the `yarn-prebuilt-artifacts` bucket in GCS using the correct version number
            'darwin-arm64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/darwin-arm64-93.gz',
              sha256: 'd4b708749ddef1c87019f6b80e051ed0c29ccd1de34f233c47d8dcaddf803872',
            },
      
            'win32-x64': {
              url: 'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2/uhop/node-re2/releases/download/1.17.4/win32-x64-93.gz',
              sha256: '0320d0c0385432944c6fb3c8c8fcd78d440ce5626f7618f9ec71d88e44820674',
            },
          },
        },
      ];
      
      async function getInstalledVersion(config: Config, packageName: string) {
        const packageJSONPath = config.resolveFromRepo(
          path.join('node_modules', packageName, 'package.json')
        );
        const json = await read(packageJSONPath);
        const packageJSON = JSON.parse(json);
        return packageJSON.version;
      }
      
      async function patchModule(
        config: Config,
        log: ToolingLog,
        build: Build,
        platform: Platform,
        pkg: Package
      ) {
        const installedVersion = await getInstalledVersion(config, pkg.name);
        if (installedVersion !== pkg.version) {
          throw new Error(
            `Can't patch ${pkg.name}'s native module, we were expecting version ${pkg.version} and found ${installedVersion}`
          );
        }
        const platformName = platform.getNodeArch();
        const archive = pkg.archives[platformName];
        const archiveName = path.basename(archive.url);
        const downloadPath = config.resolveFromRepo(DOWNLOAD_DIRECTORY, pkg.name, archiveName);
        const extractPath = build.resolvePathForPlatform(platform, pkg.destinationPath);
        log.debug(`Patching ${pkg.name} binaries from ${archive.url} to ${extractPath}`);
      
        await deleteAll([extractPath], log);
  /** Download from proxy — disabled for the offline build (the changed part): reuse the archive already cached under .native_modules instead of downloading. */
      //  await downloadToDisk({
      //    log,
      //    url: archive.url,
      //    destination: downloadPath,
      //    shaChecksum: archive.sha256,
      //    shaAlgorithm: 'sha256',
      //    maxAttempts: 3,
      //  });
        switch (pkg.extractMethod) {
          case 'gunzip':
            await gunzip(downloadPath, extractPath);
            break;
          case 'untar':
            await untar(downloadPath, extractPath);
            break;
          default:
            throw new Error(`Extract method of ${pkg.extractMethod} is not supported`);
        }
      }
      
      export const PatchNativeModules: Task = {
        description: 'Patching platform-specific native modules',
        async run(config, log, build) {
          for (const pkg of packages) {
            await Promise.all(
              config.getTargetPlatforms().map(async (platform) => {
                await patchModule(config, log, build, platform, pkg);
              })
            );
          }
        },
      };
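• With downloadToDisk commented out, the gunzip step reads the archive that an earlier online build already cached under .native_modules (the file name is the basename of the archives URL in the code above). A quick check before attempting an offline build — shown here for a linux-x64 host; only the platforms you actually target need to be present:
$ pwd
/data/rsh/repository/elastic/kibana_offline_repository

$ ls .native_modules/re2/
linux-x64-93.gz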
      
• Run a Kibana build to confirm everything works (a cache check follows the command below)
  • Run with the option added to skip the Node.js download
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository
      
      $ rm -rf ./build ./target
      
      $ yarn build --skip-os-packages --release --skip-docker-ubuntu --skip-docker-cloud --skip-docker-serverless --skip-docker-ubi --skip-docker-contexts --skip-archives
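• After this online build succeeds, the caches configured earlier should hold everything the offline build needs. A rough size check — the paths are the ones set in .yarnrc and .bazelrc.common above:
$ du -sh .yarn-local-mirror .bazel-offline/disk-cache .bazel-offline/repository-cache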
• Test from Kibana bootstrap through the build
  • Run bootstrap with the --offline option (an optional network-isolation check follows the commands below)
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository
      
      $ rm -rf ./bazel-* ./node_modules ./build ./target
      
# If bootstrap fails, retrying sometimes succeeds
      $ yarn kbn bootstrap --offline
      
# After bootstrap succeeds, run a build test
# If the build fails partway, retrying sometimes succeeds
      $ yarn build --skip-os-packages --release --skip-docker-ubuntu --skip-docker-cloud --skip-docker-serverless --skip-docker-ubi --skip-docker-contexts --skip-archives
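• To confirm that --offline really avoids the network while this machine still has connectivity, one optional check is to reject outbound traffic for the build user while bootstrap runs. This is a sketch assuming sudo and iptables are available; adjust or skip it for your environment:
$ sudo iptables -A OUTPUT -m owner --uid-owner $(id -u) -j REJECT

$ yarn kbn bootstrap --offline

$ sudo iptables -D OUTPUT -m owner --uid-owner $(id -u) -j REJECT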
• Once the build succeeds, the repository is ready to be moved to the offline Linux environment
• Prepare Node.js, Yarn, Bazel, GCC, Python 3, and libxcb.so.1 on the offline Linux environment, following the setup steps from the beginning of this post
  • Testing was done on a CentOS 7.9 VM created with a minimal installation
• The following three items must be moved from the online environment holding the offline repository to the offline environment:
  • Kibana offline repository
  • Home cache
  • npm cache
  • Compress each item, then transfer the archives via SFTP (a transfer sketch follows the commands below)
# Prepare the Kibana offline repository
      $ pwd
      /data/rsh/repository/elastic/kibana_offline_repository
      
      $ rm -rf ./bazel-* ./build ./target
      
      $ cd ..
      
      $ tar -czvf ./kibana_offline_repository.tar.gz ./kibana_offline_repository
      
# Prepare the home cache
      $ cd ~
      
      $ tar -czvf ./cache.tar.gz ./.cache
      
# Prepare the npm cache
      $ cd ~
      
      $ tar -czvf ./npm_cache.tar.gz ./.npm
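• A minimal SFTP transfer sketch — offlineuser and offline-host are placeholders; substitute your own account and host:
$ cd ~

$ sftp offlineuser@offline-host
sftp> put cache.tar.gz
sftp> put npm_cache.tar.gz
sftp> put /data/rsh/repository/elastic/kibana_offline_repository.tar.gz
sftp> bye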
• On the offline environment, extract the transferred archives and move them into place (a quick location check follows the commands below)
      $ pwd
      /home/centos7/kibana_offline_setup
      
      $ ls
      dependency  repository
      
      # Kibana offline repository
      $ cd repository
      
      $ tar -xzvf ./kibana_offline_repository.tar.gz
      
$ mv ./kibana_offline_repository ~/repository/elastic
      
      # Home cache
      $ cd /home/centos7/kibana_offline_setup/dependency
      
      $ tar -xzvf ./cache.tar.gz
      
      $ mv ./.cache ~/
      
      # npm cache
      $ cd /home/centos7/kibana_offline_setup/dependency
      
      $ tar -xzvf ./npm_cache.tar.gz
      
      $ mv ./.npm ~/
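• Quick check that both caches landed where Yarn 1 and npm look by default (~/.cache/yarn and ~/.npm):
$ ls -d ~/.cache/yarn ~/.npm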
• Run the Kibana bootstrap
  • A failed run sometimes succeeds on retry
  • If it keeps failing with the same error, delete bazel-*, node_modules, target, and build, then try again (shown after the command below)
      $ pwd
      /home/centos7/repository/elastic/kibana_offline_repository
      
      $ yarn kbn bootstrap --offline
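
# If the same error keeps recurring, do the clean retry described above
$ rm -rf ./bazel-* ./node_modules ./target ./build

$ yarn kbn bootstrap --offline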
• Run the Kibana build
  • A failed run sometimes succeeds on retry
      $ yarn build --skip-os-packages --release --skip-docker-ubuntu --skip-docker-cloud --skip-docker-serverless --skip-docker-ubi --skip-docker-contexts --skip-archives --skip-initialize --skip-node-download
• Check the built directory (a quick smoke test follows the listing below)
      $ pwd
      /home/centos7/repository/elastic/kibana_offline_repository/build/default
      
      $ ls
      kibana-8.9.0-linux-x86_64
      
      $ ls kibana-8.9.0-linux-x86_64/
      LICENSE.txt  NOTICE.txt  README.txt  bin  config  data  logs  node  node_modules  package.json  packages  plugins  src  x-pack
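• A quick smoke test of the artifact before wiring it up to Elasticsearch — bin/kibana --version should print 8.9.0:
$ ./kibana-8.9.0-linux-x86_64/bin/kibana --version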
• Integration testing of the build against Elasticsearch 8.9.0 was carried out (omitted here)

Wrapping up

• Confirmed that Kibana bootstrapped and built from the offline repository starts and runs

• The parts that need changing can differ from version to version, so always proceed by testing in both the online and offline environments
