05 Jun 2018, 19:00

AWS Athena, DynamoDB and boto3

Athena

import time

import boto3

athena_database = ''
result_bucket = 'athena-query-execution-result'
query = ''

athena = boto3.client('athena')

res = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={
        'Database': athena_database
    },
    ResultConfiguration={
        'OutputLocation': 's3://' + result_bucket + '/'
    }
)

query_id = res["QueryExecutionId"]

get_execution = athena.get_query_execution(QueryExecutionId=query_id)
query_state = get_execution['QueryExecution']['Status']['State']

while query_state == 'QUEUED' or query_state == 'RUNNING':
    time.sleep(3)
    query_state = athena.get_query_execution(QueryExecutionId=query_id)['QueryExecution']['Status']['State']
    if query_state == 'SUCCEEDED':
        # query finished; the output file is now in the result bucket
        break
    elif query_state == 'CANCELLED' or query_state == 'FAILED':
        # handle the failure (Status.StateChangeReason explains why)
        break
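
Once the state is SUCCEEDED, the rows can be read back with get_query_results. A minimal sketch; for a plain SELECT the first row of the result set holds the column headers, and the full CSV also lands in the OutputLocation bucket:

result = athena.get_query_results(QueryExecutionId=query_id)
for row in result['ResultSet']['Rows'][1:]:  # skip the header row
    print([col.get('VarCharValue') for col in row['Data']])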

DynamoDB

Create Table

import boto3

table_name = ''
dynamodb = boto3.resource('dynamodb')

table = dynamodb.create_table(
    TableName=table_name,
    KeySchema=[
        {
            'AttributeName': '',
            'KeyType': 'HASH'   # partition key
        },
        {
            'AttributeName': '',
            'KeyType': 'RANGE'  # sort key
        }
    ],
    AttributeDefinitions=[
        {
            'AttributeName': '',
            'AttributeType': ''  # 'S', 'N' or 'B'
        },
        {
            'AttributeName': '',
            'AttributeType': ''  # 'S', 'N' or 'B'
        }
    ],
    ProvisionedThroughput={
        'ReadCapacityUnits': 1,
        'WriteCapacityUnits': 1
    }
)
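
create_table returns before the table is ready to use; a boto3 waiter can block until it becomes ACTIVE. A minimal sketch:

table.meta.client.get_waiter('table_exists').wait(TableName=table_name)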

Scan

import boto3

table_name = ''
dynamodb = boto3.resource('dynamodb', region_name='ap-northeast-1')
table = dynamodb.Table(table_name)

result = table.scan()
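
A single scan call returns at most 1 MB of data; when the response contains a LastEvaluatedKey, the remaining items have to be fetched in a loop. A minimal sketch:

items = result['Items']
while 'LastEvaluatedKey' in result:
    result = table.scan(ExclusiveStartKey=result['LastEvaluatedKey'])
    items.extend(result['Items'])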

Query

import boto3
from boto3.dynamodb.conditions import Key

table_name = ''
start_date = ''  # lower bound for the range key
end_date = ''    # upper bound for the range key
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table(table_name)

result = table.query(
    KeyConditionExpression=Key('PartitionKey').eq('Pattern') & Key('RangeKey').between(start_date, end_date)
)

items = result['Items']
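
Note that boto3 deserializes DynamoDB numbers as decimal.Decimal, so numeric attributes in the returned items may need converting before, say, JSON serialization.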

22 Feb 2018, 14:34

Yamaha RTX Using Tunnel Template

tunnel select 1
 tunnel template 2-20
 tunnel encapsulation l2tp
 ipsec tunnel 1
  ipsec sa policy 1 1 esp aes-cbc sha-hmac
  ipsec ike keepalive use 1 off
  ipsec ike local address 1 192.168.11.1
  ipsec ike nat-traversal 1 on
  ipsec ike pre-shared-key 1 xxxxxxxxxxxxxxxxxxxxx
  ipsec ike remote address 1 any
 l2tp tunnel disconnect time 900
 l2tp keepalive use on 10 3
 l2tp keepalive log on
 l2tp syslog on
 ip tunnel tcp mss limit auto
 tunnel enable 1
tunnel select 2
 ip tunnel tcp mss limit auto

reference

22 Feb 2018, 13:57

RuboCop and Deploy With Bitbucket Pipelines

bitbucket-pipelines.yml

pipelines:
  branches:
    master:
      - step:
          name: rubocop test
          image: ruby:2.3.6
          script:
            - gem install rubocop
            - rubocop
      - step:
          deployment: production
          image: python:3.6
          name: copy archive to s3 bucket
          caches:
            - pip
          script:
            - pip3 install -U awscli
            - git archive HEAD -o my-app.tar.gz
            - aws s3 cp my-app.tar.gz s3://my-bucket
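
For the aws s3 cp step to authenticate, AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (and optionally AWS_DEFAULT_REGION) are expected to be set as repository environment variables in the Bitbucket Pipelines settings.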

reference

19 Feb 2018, 14:50

Hugo Deployment With rsync and Bitbucket Pipelines

bitbucket-pipelines.yml

image: asato/hugo:latest

pipelines:
  branches:
    master:
      - step:
          script:
            - hugo
            - rsync -avz --delete public/ ${USER}@${HOST}:~/${DIR}
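
For rsync over SSH to run non-interactively, the SSH key pair and the target host's fingerprint have to be registered under the repository's Pipelines SSH keys settings.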

Dockerfile

FROM ubuntu

ENV HUGO_VER 0.36.1
ENV PKG_URL https://github.com/gohugoio/hugo/releases/download/v${HUGO_VER}/hugo_${HUGO_VER}_Linux-64bit.deb

RUN apt-get update -y && \
    apt-get install -y wget rsync openssh-client && \
    cd /tmp && \
    wget ${PKG_URL} && \
    dpkg -i hugo_${HUGO_VER}_Linux-64bit.deb && \
    rm -f hugo_${HUGO_VER}_Linux-64bit.deb && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* /var/tmp/*

CMD hugo version
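
The asato/hugo:latest image referenced by the pipeline can then be built from this Dockerfile and pushed to Docker Hub (docker build -t asato/hugo . followed by docker push asato/hugo) so that Pipelines can pull it.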

reference