Create Argo Namespace
kubectl create namespace argo
kubectl get ns
Set Up the Argo Version and Install It
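As a hedged sketch of this step (the version below is illustrative; pin the release your environment actually uses), Argo Workflows is installed by applying the release manifest into the argo namespace and then checking that its pods come up:

# Illustrative version - substitute the Argo Workflows release you want to pin
ARGO_VERSION=v3.5.8
kubectl apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/${ARGO_VERSION}/install.yaml

# Verify the workflow controller and argo server pods are running
kubectl get pods -n argo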
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: vote-ci-
spec:
  entrypoint: main
  arguments:
    parameters:
      - name: repo-url
        value: "https://github.com/xxxxxx/vote.git"
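Because the manifest uses generateName, it is submitted with the argo CLI (or kubectl create) rather than kubectl apply. A minimal sketch, assuming the manifest above is saved as vote-ci.yaml:

# Assumed file name; submit into the argo namespace and watch progress
argo submit -n argo vote-ci.yaml --watch -p repo-url="https://github.com/xxxxxx/vote.git"

# List workflows to see the generated name (vote-ci-<suffix>)
argo list -n argo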
Short Answer: Yes, but only selectively.
An AI Platform Engineer focuses on building, deploying, and optimizing AI/ML models at scale, not on developing new ML algorithms or performing deep data science research. However, to work effectively with Data Scientists and MLOps workflows, an AI Platform Engineer should understand these key Data Science essentials:
✅ Understanding ML model workflows (How data moves through AI/ML pipelines)
✅ Feature Engineering & Feature Stores (How data is prepped for models)
✅ Fine-tuning & Inference Optimization (How models are trained and served efficiently)
✅ Evaluating Model Performance (Ensuring models meet production-quality standards)
Introduction:
In this mini-project, you will learn how to use Amazon S3 to host your resume, creating a publicly accessible web-based version of your professional profile. This is a practical skill that can be applied in real-world scenarios, such as job applications or networking events.
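As a rough sketch of the workflow (the bucket name and file names below are assumptions, not values from this project), the resume can be uploaded and served with the AWS CLI:

# Assumed bucket and file names - replace with your own
aws s3 mb s3://my-resume-bucket
aws s3 cp resume.html s3://my-resume-bucket/index.html
# Turn on static website hosting so the resume is reachable over HTTP
aws s3 website s3://my-resume-bucket/ --index-document index.html

Serving the page publicly typically also requires relaxing the bucket's Block Public Access settings and attaching a public-read bucket policy.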
Project Goal:
#!/bin/bash
# Switch to the root user (these steps assume root privileges)
sudo su

# Download the WP-CLI phar, make it executable, and install it system-wide
curl -O https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar
chmod +x wp-cli.phar
sudo mv wp-cli.phar /usr/local/bin/wp

# Clear cached transients and confirm the site and home URLs of the WordPress install
wp transient delete --all --path=/srv/www/wordpress --allow-root
wp option get siteurl --path=/srv/www/wordpress --allow-root
wp option get home --path=/srv/www/wordpress --allow-root
name: Test and Trigger Databricks Job

# Trigger the workflow on push to the main branch
on:
  push:
    branches:
      - main

jobs:
  test:
import pytest
import Mixed_Language_Demo as databrics_code  # Import the module under test


def test_dataframe_creation():
    """
    Test for the `create_dataframe` function
    """
    # Create the DataFrame
    df = databrics_code.create_dataframe()
    # Basic sanity check: the function should return a DataFrame object
    assert df is not None
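Before wiring this into CI, the same test can be run locally with pytest; a minimal sketch, assuming the code above is saved as test_databricks_code.py:

# Assumed file name - run the tests verbosely
pytest -v test_databricks_code.py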
name: Test Notebook Code

# Trigger the workflow on push events to all branches
on:
  push:
    branches:
      - '*'

jobs:
  test:
#!/bin/bash
echo "I: Installing Apache and PHP ..."
sudo apt update
sudo apt install -yq apache2 \
  ghostscript \
  libapache2-mod-php \
  mysql-client \
  php \
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
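These rewrite rules only take effect if Apache's rewrite module is enabled and the site allows .htaccess overrides; on Debian/Ubuntu the module is typically enabled like this (assuming the stock apache2 packaging used by the install script above):

# Enable mod_rewrite and restart Apache so the rules above are applied
sudo a2enmod rewrite
sudo systemctl restart apache2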