Deeply inspired by the original game from the late 20th century, this game will be used as a coding exercise in which we expect to practice:
- Data structures (lists, queues, matrices, etc.)
- Algorithms
- Unit testing
As an optional goal we have:
- Threads
build:
  plugins:
    - groupId: org.apache.maven.plugins
      artifactId: maven-compiler-plugin
      version: 3.8.1
      configuration:
        # when release is set, the compiler plugin ignores source/target
        release: 11
        target: 11
        source: 11
import io.undertow.Undertow;
import io.undertow.server.handlers.encoding.EncodingHandler;
import io.undertow.server.HttpHandler;
import io.undertow.server.HttpServerExchange;
import javax.inject.Singleton;
import kikaha.core.modules.Module;

/**
 * Compresses all responses sent to the http client (usually the browser)
 * using the GZip algorithm. It will reduce the amount of data being sent out
import io.vertx.core.AbstractVerticle
import io.vertx.core.http.HttpMethod
import io.vertx.core.http.HttpServerOptions
import io.vertx.core.http.HttpServerRequest
import io.vertx.core.http.HttpServerResponse
import io.vertx.ext.web.Router
import io.vertx.ext.web.handler.BodyHandler
import java.util.concurrent.atomic.AtomicBoolean

/**
#!/usr/bin/env bash
# Configures an Amazon Linux instance so that it:
# - has its logs automatically sent to CloudWatch
# - comes pre-configured with Python, Ruby and Java runtime environments
# - runs the CodeDeploy agent, waiting for deployment tasks

# VARIABLES
INSTANCE_ID=$(/usr/bin/curl -s http://169.254.169.254/latest/meta-data/instance-id)
URL_CODE_DEPLOY=https://aws-codedeploy-${region}.s3.amazonaws.com/latest/codedeploy-agent.noarch.rpm
URL_CORRETTO=https://d1f2yzg3dx5xke.cloudfront.net/java-1.8.0-amazon-corretto-1.8.0_202.b08-1.amzn2.x86_64.rpm
import java.lang.management.ManagementFactory
import java.util.concurrent.atomic.AtomicLong

class MetricService {

    // factor used to convert raw byte counts into megabytes
    val megaByte = 1024.0 * 1024.0

    // registry of named metrics, pre-populated with a gauge that reports JVM heap usage in MB
    private val metrics = mutableMapOf<String, Metric>().apply {
        val memory = ManagementFactory.getMemoryMXBean()
        put("jvm.heap.used", Gauge(isLocal = true) { memory.heapMemoryUsage.used / megaByte })
We cannot assume that a Lambda container will hold state in case of a failure or timeout. AWS manages its containers in an unpredictable way, hence we need to assume that they are stateless. Thus, the workaround implemented in the lambda.js file would not work well when the container is not reused, leading to undesired side effects.
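The same limitation applies to any runtime. As a minimal sketch in Java (the StatefulHandler class and its counter are hypothetical, and it assumes the aws-lambda-java-core dependency), this is the kind of per-container state that only survives while AWS happens to reuse the same container:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import java.util.concurrent.atomic.AtomicLong;

public class StatefulHandler implements RequestHandler<String, String> {

    // Per-container state: it is kept only while AWS reuses this container.
    // A failure, timeout, or scale-out event brings up a fresh container where
    // the counter starts from zero again, so it cannot reliably deduplicate
    // retried invocations.
    private static final AtomicLong warmInvocations = new AtomicLong();

    @Override
    public String handleRequest(String input, Context context) {
        long count = warmInvocations.incrementAndGet();
        return "request " + context.getAwsRequestId() + " handled on warm invocation #" + count;
    }
}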
Here is the log output, showing that, in case of a retry attempt, the execution wasn't dropped, causing the system to execute it twice.
var http = require('http');

// module-level state: it only survives while the very same container is reused
var lastReqId;
DynamoDB is a database designed around the Key->Value model. In practice, its purpose is to fetch data quickly by its ID (the key). Unfortunately, by its very nature it cannot perform COUNT, MAX, MIN and the other aggregation functions that exist in traditional databases. I see a few alternatives, but which one applies will depend on how much ($ or hours) you are willing to invest.
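One of those alternatives is to keep the aggregate up to date at write time instead of computing it at read time. Below is a minimal sketch, assuming the AWS SDK v1 for Java and a hypothetical orders table that holds a dedicated counter item:

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import com.amazonaws.services.dynamodbv2.model.UpdateItemRequest;
import java.util.Map;

public class OrderCounter {

    private final AmazonDynamoDB dynamo = AmazonDynamoDBClientBuilder.defaultClient();

    // Atomically increments a counter item every time an order is written,
    // so a "COUNT" becomes a cheap GetItem on that item instead of a full Scan.
    public void incrementOrderCount() {
        final UpdateItemRequest request = new UpdateItemRequest()
            .withTableName( "orders" )                                          // hypothetical table
            .addKeyEntry( "id", new AttributeValue( "aggregate#order-count" ) ) // dedicated counter item
            .withUpdateExpression( "ADD itemCount :inc" )
            .withExpressionAttributeValues( Map.of( ":inc", new AttributeValue().withN( "1" ) ) );
        dynamo.updateItem( request );
    }
}

The trade-off is one extra write per insert (and a one-off backfill of the counter), which is exactly the $ or hours investment mentioned above.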
First of all, ensure you have a script that chkconfig
is able to run. The file script.sh
below is a useful
script boilerplate which you can use as a starting point if you don't have one.
Then, execute the following command:
$ chkconfig --level 345 script.sh on
import java.lang.reflect.*;
import java.util.*;
import java.util.function.Function;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;
import lombok.*;

/**
 * Reflection-based serializer that converts plain Java objects into DynamoDB AttributeValues.
 */
@SuppressWarnings( "unchecked" )
public abstract class DynamoReflectiveSerializer {

    // caches, per class, the factories that know how to read a given field as an AttributeValue
    private static final Map<Class, Function<Field, FieldAttributeValueReader>> fieldHoldersFactories = new HashMap<>();