Friday, May 29, 2020

3 cloud security mistakes you’re likely making without knowing

Enterprises hastily moving to cloud-based platforms in the wake of the pandemic are likely to make some major security mistakes, and the faster they move, the more likely the mistakes. Why? The cloud is new to most of them, well-established best practices for cloud security are still scarce, and people get overwhelmed by the task of moving to the cloud both quickly and securely.

I’ve put together a short list of some of the security mistakes I see as enterprises rush to the cloud.

Mistake 1: Not gathering and reacting to operational security data in real time.

The notion of SIEM (security information and event management) means gathering operational security data in a central location so you can manage existing or emerging incidents in real time. We can leverage that data as a weapon: it supports audits, can be correlated across systems, and feeds predictive analytics, all to gain better insight into the state of security and to proactively combat attacks.
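
To make the correlation idea concrete, here is a toy sketch (purely illustrative and not from the article; the event shape, field names, and thresholds are made up) of flagging source addresses that produce a burst of failed logins in centrally collected data:

// Hypothetical, simplified security event; real SIEM records carry much more context.
data class SecurityEvent(val type: String, val sourceIp: String, val epochSeconds: Long)

// Flag any source IP with `threshold` or more failed logins inside a sliding time window.
fun suspiciousSources(
    events: List<SecurityEvent>,
    threshold: Int = 5,
    windowSeconds: Long = 300
): Set<String> =
    events.filter { it.type == "LOGIN_FAILURE" }
        .groupBy { it.sourceIp }
        .filterValues { failures ->
            val times = failures.map { it.epochSeconds }.sorted()
            times.windowed(threshold).any { it.last() - it.first() <= windowSeconds }
        }
        .keys

fun main() {
    val events = listOf(
        SecurityEvent("LOGIN_FAILURE", "203.0.113.7", 1000),
        SecurityEvent("LOGIN_FAILURE", "203.0.113.7", 1010),
        SecurityEvent("LOGIN_FAILURE", "203.0.113.7", 1020),
        SecurityEvent("LOGIN_FAILURE", "203.0.113.7", 1030),
        SecurityEvent("LOGIN_FAILURE", "203.0.113.7", 1040),
        SecurityEvent("LOGIN_SUCCESS", "198.51.100.2", 1050)
    )
    println(suspiciousSources(events)) // prints [203.0.113.7]
}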

To read this article in full, please click here

Thursday, May 28, 2020

New – SaaS Contract Upgrades and Renewals for AWS Marketplace

AWS Marketplace currently contains over 7,500 listings from 1,500 independent software vendors (ISVs). You can browse the digital catalog to find, test, buy, and deploy software that runs on AWS.

Each ISV sets the pricing model and prices for their software. There are a variety of options available, including free trials, hourly or usage-based pricing, monthly and annual AMI pricing, and up-front pricing for 1-, 2-, and 3-year contracts. These options give each ISV the flexibility to define the models that work best for their customers. If their offering is delivered via a Software as a Service (SaaS) contract model, the seller can define the usage categories, dimensions, and contract length.

Upgrades & Renewals
AWS customers that make use of the SaaS and usage-based products that they find in AWS Marketplace generally start with a small commitment and then want to upgrade or renew them early as their workloads expand.

Today we are making the process of upgrading and renewing these contracts easier than ever before. While the initial contract is still in effect, buyers can communicate with sellers to negotiate a new Private Offer that best meets their needs. The offer can include additional entitlements to use the product, pricing discounts, a payment schedule, a revised contract end-date, and changes to the end-user license agreement (EULA), all in accord with the needs of a specific buyer.

Once the buyer accepts the offer, the new terms go into effect immediately. This new, streamlined process means that sellers no longer need to track parallel (paper and digital) contracts, and also ensures that buyers receive continuous service.

Let’s say I am already using a product from AWS Marketplace and negotiate an extended contract end-date with the seller. The seller creates a Private Offer for me and sends me a link that I follow in order to find & review it:

I select the Upgrade offer, and I can see I have a new contract end date, the number of dimensions on my upgrade contract, and the payment schedule. I click Upgrade current contract to proceed:

I confirm my intent:

And I am good to go:

This feature is available to all buyers & SaaS sellers, and applies to SaaS contracts and contracts with consumption pricing.

Jeff;


Microsoft’s Project Tye aims to tame microservices development

Finding it tough to work with microservices? With Project Tye, Microsoft is offering an experimental developer tool intended to make it easier to build, test, and deploy microservices and distributed applications.

Microsoft believes Project Tye, a .NET Foundation project introduced May 21, will ease common pain points developers encounter when building applications that talk to a database or that are composed of multiple services that communicate with each other. Project Tye is designed to make it easier for developers to run multiple application components simultaneously and to deploy distributed apps to platforms such as Kubernetes.

To read this article in full, please click here

Single Sign-On between Okta Universal Directory and AWS

Enterprises adopting the AWS Cloud want to manage identities effectively. Having one central place to manage identities makes it easier to enforce policies, manage access permissions, and reduce overhead by removing the need to duplicate users and user permissions across multiple identity silos. Having a unique identity also simplifies access for all of us, the users: we all have access to multiple systems, and we all have trouble remembering multiple distinct passwords. Being able to connect to multiple systems with a single combination of user name and password is a daily security and productivity gain. Being able to link an identity from one system with an identity managed on another trusted system is known as “Identity Federation”, of which single sign-on is a subset. Identity Federation is made possible by industry standards such as Security Assertion Markup Language (SAML), OAuth, OpenID, and others.

Recently, we announced a new evolution of AWS Single Sign-On, allowing you to link AWS identities with Azure Active Directory identities. We did not stop there. Today, we are announcing the integration of AWS Single Sign-On with Okta Universal Directory.

Let me show you the experience for System Administrators, then I will demonstrate the single sign-on experience for the users.

First, let’s imagine that I am an administrator for an enterprise that already uses Okta Universal Directory to manage its workforce identities. Now I want to give my users simple, easy-to-use access to our AWS environments, using their existing identities. Like most enterprises, I manage multiple AWS accounts. I want more than just a single sign-on solution; I want to manage access to my AWS accounts centrally. I do not want to duplicate my Okta groups and user memberships by hand, nor maintain multiple identity systems (Okta Universal Directory plus one for each AWS account I manage). I want to enable automatic user synchronization between Okta and AWS. My users will sign in to the AWS environments using the experience they are already familiar with in Okta.

Connecting Okta as an identity source for AWS Single Sign-On
The first step is to add AWS Single Sign-On as an “application” Okta users can connect to. I navigate to the Okta administration console, log in with my Okta administrator credentials, and open the Applications tab.

Okta admin console

I click the green Add Application button and search for the AWS SSO application. I click Add.

Okta add application

I enter a name for the app (you can choose whatever name you like) and click Done.

On the next screen, I configure the mutual trust between AWS Single Sign-On and Okta. I first download the SAML metadata file generated by Okta by clicking the blue Identity Provider Metadata link. I keep this file; I will need it later to configure the AWS side of the single sign-on.

Okta Identity Provider metadata

Now that I have the metadata file, I open the AWS Management Console in a new tab. I keep the Okta tab open, as the procedure is not finished there yet. I navigate to AWS Single Sign-On and click Enable AWS SSO.

I click Settings in the navigation panel. I first set the Identity source by clicking the Change link and selecting External identity provider from the list of options. Second, in the Identity provider metadata section, I browse to and select the XML file I downloaded from Okta.

SSO configure metadata

I click Next: Review, enter CONFIRM in the provided field, and finally click Change identity source to complete the AWS Single Sign-On side of the process. I take note of the two values AWS SSO ACS URL and AWS SSO Issuer URL as I must enter these in the Okta console.

AWS SSO Save URLs

I return to the tab I left open on the Okta console and enter the values for AWS SSO ACS URL and AWS SSO Issuer URL.

OKTA ACS URLs

I click Save to complete the configuration.

Configuring Automatic Provisioning
Now that Okta is configured so my users can sign in using AWS Single Sign-On, I’m going to enable automatic provisioning of user accounts. As new accounts are added to Okta and assigned to the AWS SSO application, a corresponding AWS Single Sign-On user is created automatically. As an administrator, I do not need to do any work to configure a corresponding account in AWS to map to the Okta user.

From the AWS Single Sign-On Console, I navigate to Settings and then click the Enable identity synchronization link. This opens a dialog containing the values for the SCIM endpoint and an OAuth bearer access token (hidden by default). I need both of these values to use in the Okta application settings.

AWS SSO SCIM

I switch back to the tab open on the Okta console and click the Provisioning tab under the AWS SSO application. I select Enable API Integration, then fill in Base URL (the SCIM endpoint value copied from the AWS Single Sign-On console) and API Token (the access token copied from the AWS Single Sign-On console).

Okta API Integration

I click Test API Credentials to verify everything works as expected. Then I click To App to enable user creation, update, and deactivation.

Okta Provisioning To App
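
Behind the scenes, this integration relies on SCIM (System for Cross-domain Identity Management): Okta calls the AWS SSO SCIM endpoint with the bearer access token to create, update, and deactivate users. As a rough sketch of the kind of exchange involved (the endpoint, token, and user below are placeholders, and Okta’s real payloads carry more attributes), a SCIM user-creation call could look like this:

import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Placeholder values: the real SCIM endpoint and access token come from the
// AWS Single Sign-On console and are stored in the Okta provisioning settings.
const val SCIM_ENDPOINT = "https://scim.eu-west-1.amazonaws.com/EXAMPLETENANT/scim/v2"
const val ACCESS_TOKEN = "REPLACE_WITH_ACCESS_TOKEN"

fun main() {
    // A simplified SCIM 2.0 user-creation payload.
    val body = """
        {
          "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
          "userName": "jane.doe@example.com",
          "name": { "givenName": "Jane", "familyName": "Doe" },
          "emails": [{ "value": "jane.doe@example.com", "primary": true }],
          "active": true
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("$SCIM_ENDPOINT/Users"))
        .header("Authorization", "Bearer $ACCESS_TOKEN")
        .header("Content-Type", "application/scim+json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    // A 201 Created response indicates the user now exists in AWS Single Sign-On.
    println("HTTP ${response.statusCode()}")
    println(response.body())
}

Okta issues these calls for you; you never make them yourself, but seeing the shape of the exchange clarifies what the Base URL and API Token fields above are used for.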

With provisioning enabled, my final task is to assign the users and groups that I want to synchronize from Okta to AWS Single Sign-On. I click the Assignments tab, then click Assign and select the Okta users and groups I want to have access to AWS.

OKTA Assignments

These users are synchronized to AWS Single Sign-On, and they now see the AWS Single Sign-On application appear in their Okta portal.

Okta Portal User View

To verify that user synchronization is working, I switch back to the AWS Single Sign-On console and select the Users tab. The users I assigned in the Okta console are present.

AWS SSO User View

I Configured Single Sign-On, Now What?
Okta is now my single source of truth for user identities and their group assignments, and periodic synchronization automatically creates corresponding identities in AWS Single Sign-On. My users sign in to their AWS accounts and applications with their familiar Okta credentials and experience, and don’t have to remember an additional user name or password. However, as things stand, my users can only sign in. To control what they can access once signed in to AWS, I must set up permissions in AWS Single Sign-On.

Back in the AWS SSO console, I click AWS Accounts in the left navigation and select the account from my AWS Organization that I am granting access to. For enterprises with multiple accounts for multiple applications or environments, this gives you the granularity to grant access to a subset of your AWS accounts.

AWS SSO Select AWS Account

I click Assign users to assign SSO users or groups to a set of IAM permissions. For this example, I assign just one user, the one with the @example.com email address.

Assign SSO Users

I click Next: Permission sets and then Create new permission set to create the set of IAM policies describing the permissions I am granting to these Okta users. For this example, I grant read-only permission on all AWS services.

SSO Permission set

And voilà, I am ready to test this setup.

SSO User Experience for the console
Now that I have shown you the steps system administrators take to configure the integration, let me show you the user experience.

As an AWS account user, I can sign in to Okta and get access to the AWS Management Console. I can start either from the AWS Single Sign-On user portal (the URL is on the AWS Single Sign-On settings page) or from the Okta user portal page, where I select the AWS SSO app.

I choose to start from the AWS SSO User Portal. I am redirected to the Okta login page. I enter my Okta credentials and I land on the AWS Account and Role selection page. I click on AWS Account, select the account I want to log into, and click Management console. After a few additional redirections, I land on the AWS Console page.

SSO User experience

SSO User Experience for the CLI
System administrators, DevOps engineers, developers, and automation scripts do not use the AWS console; they use the AWS Command Line Interface (CLI) instead. To configure SSO for the command line, I open a terminal and type aws configure sso, then enter the AWS SSO user portal URL and the Region.

$aws configure sso
SSO start URL [None]: https://d-0123456789.awsapps.com/start
SSO Region [None]: eu-west-1
Attempting to automatically open the SSO authorization page in your default browser.
If the browser does not open or you wish to use a different device to authorize this request, open the following URL:

https://device.sso.eu-west-1.amazonaws.com/

Then enter the code:

AAAA-BBBB

At this stage, my default browser pops up and I enter my Okta credentials on the Okta login page. I confirm I want to enable SSO for the CLI.

SSO for the CLI

I close the browser when I receive this message:

AWS SSO CLI Close Browser Message

The CLI automatically resumes the configuration. I enter the default Region, the default output format, and the name of the CLI profile I want to use.

The only AWS account available to you is: 012345678901
Using the account ID 012345678901
The only role available to you is: ViewOnlyAccess
Using the role name "ViewOnlyAccess"
CLI default client Region [eu-west-1]:
CLI default output format [None]:
CLI profile name [okta]:

To use this profile, specify the profile name using --profile, as shown:

aws s3 ls --profile okta

I am now ready to use the CLI with SSO. In my terminal, I type:

aws --profile okta s3 ls
2020-05-04 23:14:49 do-not-delete-gatedgarden-audit-012345678901
2015-09-24 16:46:30 elasticbeanstalk-eu-west-1-012345678901
2015-06-11 08:23:17 elasticbeanstalk-us-west-2-012345678901
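
The same named profile also works outside the CLI. As a hedged sketch (not part of the original post; it assumes the AWS SDK for Java v2 is on the classpath, in a version recent enough to resolve SSO-based profiles), the Kotlin equivalent of the aws s3 ls call above could look like this:

import software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider
import software.amazon.awssdk.regions.Region
import software.amazon.awssdk.services.s3.S3Client

fun main() {
    // Reuse the named profile created by `aws configure sso`.
    val s3 = S3Client.builder()
        .region(Region.EU_WEST_1)
        .credentialsProvider(ProfileCredentialsProvider.create("okta"))
        .build()

    // Equivalent of `aws s3 ls --profile okta`: list the buckets in the account.
    s3.listBuckets().buckets().forEach { bucket ->
        println(bucket.name())
    }
}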

If the machine on which you want to configure CLI SSO has no graphical user interface, you can configure SSO in headless mode, using the URL and the code provided by the CLI (https://device.sso.eu-west-1.amazonaws.com/ and AAAA-BBBB in the example above).

In this post, I showed how you can take advantage of the new AWS Single Sign-On capabilities to link Okta identities to AWS accounts for user single sign-on. I also made use of the automatic provisioning support to reduce the complexity of managing and using identities. Administrators can now use a single source of truth for managing their users, and users no longer need an additional identity and password to sign in to their AWS accounts and applications.

AWS Single Sign-On with Okta is free to use, and is available in all Regions where AWS Single Sign-On is available. The full list is here.

To see all this in motion, you can check out the following demo video for more details on getting started.

-- seb

Career roadmap: Cloud engineer

The shift to the cloud has been so pervasive that it has left a lot of companies with a skills gap they are struggling to fill with professionals who have cloud experience. That makes it a great time to be a cloud engineer.

Quantum AI is still years from enterprise prime time

Quantum computing’s potential to revolutionize AI depends on growth of a developer ecosystem in which suitable tools, skills, and platforms are in abundance. To be considered ready for enterprise production deployment, the quantum AI industry would have to, at the very least, reach the following key milestones:

To read this article in full, please click here


Wednesday, May 27, 2020

Red Hat Runtimes adds Kubernetes-native Quarkus Java stack

Red Hat’s Quarkus, a Kubernetes-native Java stack, is now supported on the Red Hat Runtimes platform for developing cloud-native applications.

A build of Quarkus is now part of Red Hat Runtimes middleware and integrates with the Red Hat OpenShift Kubernetes container platform for managing cloud deployments, Red Hat said this week.

Quarkus is intended for building lightweight, container-based microservices and serverless applications. Inclusion in Runtimes gives enterprise customers a version of the open source Java stack that is supported by Red Hat. Previously Quarkus had been available just with community support.

To read this article in full, please click here

New – AWS Amplify Libraries for Android and iOS

When you develop mobile applications, you must develop a set of cloud-powered functionalities for each project. For example, most applications require user authentication or detailed in-app analytics. Your application most probably calls REST or GraphQL APIs and needs to support offline scenarios and data synchronization. AWS Amplify makes it easy to integrate such functionalities into your mobile and web applications.

AWS Amplify is a set of tools and services for building secure, scalable mobile and web applications. It is made of three components: an open source set of libraries and UI components for adding cloud-powered functionalities, an interactive command line toolchain to create and manage a cloud backend, and the AWS Amplify Console, an AWS service to deploy and host full-stack serverless web applications.

Today, I am happy to announce the availability of the Amplify iOS and Amplify Android libraries and tools, which help mobile application developers easily build secure and scalable cloud-powered applications.

Until today, when you developed a cloud-powered mobile application, you were using a combination of tools and SDKs: the Amplify CLI to create and manage your backend, and one or several AWS Mobile SDKs to access the backend. In general, AWS Mobile SDKs are low-level wrappers around the AWS Services APIs. They require you to understand the API details and, most of the time, to write many lines of undifferentiated code, such as object (de)serialization, error handling, etc.

Amplify iOS and Amplify Android simplify this. First, they provide native libraries oriented around use cases, such as authentication, data storage and access, and machine learning predictions. They provide a declarative interface that enables you to programmatically apply best practices through abstractions. Thinking in terms of use cases instead of AWS services results in a higher-level abstraction, faster development cycles, and fewer lines of code. Second, they provide tools that integrate with your native IDE toolchain: Xcode for iOS and Gradle for Android.

Using Amplify iOS or Amplify Android is our recommended way to integrate a cloud-based backend in your mobile application.

How to get started?
I’ve built two simple mobile applications (one on iOS and one on Android) to show you how to get started. The sources for these examples are available on my GitHub. As you can see, I am not a graphic designer: the applications have a list of UI buttons to trigger different flows, and the results are only visible in the console.

Amplify iOS & Android Demo

Amplify libraries for mobile are organized around categories for Auth, API (REST and GraphQL), Analytics, File Storage, DataStore, and Predictions. In this example, I use three categories: Auth, to implement sign-in, sign-up, and a Login with Facebook flow; DataStore, a query-able, on-device persistent storage engine that seamlessly synchronizes data between the app and the cloud with built-in versioning, conflict detection, and resolution capabilities; and Predictions, to add automatic translation between English and French.

Let’s review the four main steps and lines of code to get started on each platform. For a detailed step-by-step tutorial, have a look at the Amplify iOS or Amplify Android documentation.

The first step is to set up your project, adding the required dependencies and build steps.

On iOS, you add a couple of lines to your Podfile and add the AWS Amplify build script to the build phase of your project.
On Android, you do the same in your Gradle files, both at the module (app) level and at the project level.

// iOS Podfile
target 'amplify-lib-ios-demo' do
  # Comment the next line if you don't want to use dynamic frameworks
  use_frameworks!

  # Pods for amplify-lib-ios-demo
    pod 'Amplify'
    pod 'Amplify/Tools'

    pod 'AmplifyPlugins/AWSAPIPlugin'
    pod 'AmplifyPlugins/AWSDataStorePlugin'
    pod 'AmplifyPlugins/AWSCognitoAuthPlugin'
    pod 'AWSPredictionsPlugin'
end

// Android build.gradle fragment (Module: app) 
...
compileOptions {
    sourceCompatibility JavaVersion.VERSION_1_8
    targetCompatibility JavaVersion.VERSION_1_8
}
dependencies {
    implementation 'com.amplifyframework:core:1.0.0'
    implementation 'com.amplifyframework:aws-datastore:1.0.0'
    implementation 'com.amplifyframework:aws-api:1.0.0'
    implementation 'com.amplifyframework:aws-predictions:1.0.0'
    implementation 'com.amplifyframework:aws-auth-cognito:1.0.0'
}
...
// Android build.gradle fragment (Project: My Application)
...
repositories {
    mavenCentral()
    google()
    jcenter()
}
dependencies {
        classpath 'com.amplifyframework:amplify-tools-gradle-plugin:1.0.0'
}
apply plugin: 'com.amplifyframework.amplifytools'
...

On iOS, you also must manually add the amplify-tools.sh script to your build steps.

When this is done, you type pod install for iOS or you sync the project with Gradle.

The second step is to add the plugins for each category to Amplify at application initialization time. On iOS, I am using didFinishLaunchingWithOptions from the AppDelegate. On Android, I am using onCreate from MainActivity. You’re free to initialize Amplify at any stage in your app; it does not have to be at app startup time.

    // iOS AppDelegate class
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        
        do {
            try Amplify.add(plugin: AWSAPIPlugin())
            try Amplify.add(plugin: AWSDataStorePlugin(modelRegistration: AmplifyModels()))
            try Amplify.add(plugin: AWSCognitoAuthPlugin())
            try Amplify.add(plugin: AWSPredictionsPlugin())
            
            try Amplify.configure()
            print("Amplify initialized")
        } catch {
            print("Failed to configure Amplify \(error)")
        }
        return true
    }
   // Android MainActivity class (Kotlin version)
   override fun onCreate(savedInstanceState: Bundle?) {
        // ...

        try {
            Amplify.addPlugin(AWSDataStorePlugin())
            Amplify.addPlugin(AWSApiPlugin())
            Amplify.addPlugin(AWSCognitoAuthPlugin())
            Amplify.addPlugin(AWSPredictionsPlugin())
            Amplify.configure(applicationContext)
            Log.i(TAG, "Initialized Amplify")
        } catch (error: AmplifyException) {
            Log.e(TAG, "Could not initialize Amplify", error)
        }
    }

The third step varies from one category to another. Usually, it involves using the AWS Amplify command line to provision and configure your backend. You type commands like amplify add auth or amplify add predictions to configure a category.

For example, to configure user authentication with Amazon Cognito and social identity providers, such as Login with Facebook, you run amplify add auth and answer the questions the CLI asks. This step is identical for iOS and Android, as we are creating and configuring the cloud backend.

To learn how to configure single sign-on with social identity providers such as Facebook, Google or Amazon, you can refer to the step-by-step instructions I wrote in this Amplify iOS Workshop (I will update the workshop soon to take advantage of these new AWS Amplify libraries).

Configuring the DataStore involves creating a GraphQL schema for your data. Amplify generates native (Swift or Java) code to represent your data in your app. It transparently handles an offline datastore to store your data and syncs it with the backend when network connectivity is available.

The fourth and last step is to actually invoke Amplify’s library code at runtime.

For example, to trigger an authentication using the Amazon Cognito hosted web user interface, you use the following code:

// iOS (swift) in AppDelegate object
    func signIn() {
        _ = Amplify.Auth.signInWithWebUI(presentationAnchor: UIApplication.shared.windows.first!) { (result) in
            switch(result) {
                case .success(let result):
                    print(result)
                case .failure(let error):
                    print("Can not signin \(error)")
            }
        }
    }
// Android (Kotlin) in MainActivity 
    fun signIn(view: View?) {
        Amplify.Auth.signInWithWebUI(
            this,
            { result: AuthSignInResult -> Log.i(TAG, result.toString()) },
            { error: AuthException -> Log.e(TAG, error.toString()) }
        )
    }

The above triggers the following web view:

Hosted UI for Cognito

Similarly, to create an item in the DataStore (and persist it to Amazon DynamoDB over GraphQL), you need the following code:

    // iOS 
    func create() {
        let note = Note(content: "Build iOS application")
        Amplify.DataStore.save(note) {
            switch $0 {
            case .success:
                print("Added note")
            case .failure(let error):
                print("Error adding note - \(error.localizedDescription)")
            }
        }
    }
   // Android 
    fun create(view: View?) {
        val note: Note = Note.builder()
            .content("Build Android application")
            .build()

        Amplify.DataStore.save(
            note,
            { success -> Log.i(TAG, "Saved item: " + success.item.content) },
            { error -> Log.e(TAG, "Could not save item to DataStore", error) }
        )
    }
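
The DataStore is also query-able. Here is a rough sketch (not from the original post; it reuses the same hypothetical Note model, TAG constant, and MainActivity context as the snippets above) of reading the saved notes back on Android:

    // Android (Kotlin) in MainActivity
    fun query(view: View?) {
        Amplify.DataStore.query(
            Note::class.java,
            { notes -> notes.forEach { note -> Log.i(TAG, "Found note: " + note.content) } },
            { error -> Log.e(TAG, "Could not query DataStore", error) }
        )
    }

The success callback receives an iterator over the locally stored items, so the call works offline and reflects whatever has been synchronized so far.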

And to trigger a text translation with the Predictions category, you just need the following code:

    // iOS 
    func translate(text: String) {
        _ = Amplify.Predictions.convert(textToTranslate: text, language: LanguageType.english, targetLanguage: LanguageType.french) {
            switch $0 {
            case .success(let result):
                // update UI on main thread 
                DispatchQueue.main.async() {
                    self.data.translatedText = result.text
                }
            case .failure(let error):
                print("Error adding note - \(error.localizedDescription)")
            }
        }
    }
   // Android
    fun translate(view: View?) {
        Log.i(TAG, "Translating")

        val et : EditText = findViewById(R.id.toBeTranslated)
        val tv : TextView = findViewById(R.id.translated)

        Amplify.Predictions.translateText(
            et.text.toString(),
            LanguageType.ENGLISH,
            LanguageType.FRENCH,
            { success -> tv.setText(success.translatedText) },
            { failure -> Log.e(TAG, failure.localizedMessage) }
        )
    }

Short and slick, isn’t it?

Amplify Mobile demo translation

Price and Availability
AWS Amplify is available free of charge; you only pay for the backend services your application uses, beyond the free tier.

Amplify iOS and Amplify Android are available today from the CocoaPods and Maven Central repositories. The source code is available on GitHub (iOS or Android). Do not hesitate to send us your feedback (Doc, iOS, and Android) or a pull request :-)

I am also curious to learn about the amazing mobile apps you are building with AWS Amplify. Do not hesitate to share your screenshots or App Store links with me.

Happy building!

-- seb

Introducing the latest AWS Heroes – May, 2020

Communities are now more important than ever. Members of local communities look to their leaders for guidance and mentorship on how to build AWS skills, solve technical problems, and grow their careers. Traditionally, this AWS knowledge and community support is shared in many ways, including via social media, blogs, open source projects, or by presenting at events and meetups. More recently, leaders have been working to keep communities connected and supporting each other during challenging times.

The AWS Heroes program recognizes AWS enthusiasts who go above and beyond and have a wide-reaching impact in their community. Today, we are excited to introduce the newest AWS Heroes, including the first Heroes from South Africa and France:

Philippe Abdoulaye – Raleigh, USA

Community Hero Philippe Abdoulaye is the founder of ITaaSNow, an AWS advisory consulting business specializing in how to leverage the cloud to boost business performance. His main goal is to advise companies on how to transform IT infrastructure and IT organizations using AWS. He developed two architecture frameworks to speed up AWS architecture design and implementation. They include The Complete ITaaS Delivery Model and The AWS Virtual Data Center (VDC). He has authored seven books and 100+ articles on AWS, DevOps, and digital transformation, and gives conference talks on how to use AWS to grow businesses.

Jayesh Ahire – Pune, India

Machine Learning Hero Jayesh Ahire is an ML developer and researcher who enjoys working on distributed neural computers. He is also a leader of the Pune AWS User Group, the Pune Elasticsearch User Group, TensorFlow UG, and the Twilio India Community. As an active advocate of AWS, Jayesh has delivered various talks around AWS AI services, including Amazon SageMaker, at AWS Community Days and regular AWS meetups. He is an active blogger and has authored books on neural networks, reinforcement learning, blockchain, and the simulation hypothesis.

Parthasarathi Balasubramanian – Chennai, India

Community Hero Parthasarathi (Partha) Balasubramanian is a Cloud Solution Architect at 8K Miles. He has been an AWS user since 2013, and he holds the AWS Certified Solutions Architect Professional and Certified Security Specialty certifications. He founded the AWS User Group Chennai in 2018, which currently has 2800+ active members. He regularly organizes AWS User Group meetups and organized the first-ever AWS Community Day Chennai in 2019, which was a grand success with 450+ participants. Recently he started the AWS User Group India Facebook page for organizing live webinars, which has attracted 1100+ followers within just two months.

Matthew Bonig – Denver, USA

Data Hero Matthew Bonig is a consultant at Defiance Digital, specializing in the software development lifecycle and in utilizing serverless technologies to increase productivity. He specializes in Amazon DynamoDB, the AWS Cloud Development Kit, Amazon API Gateway, AWS Lambda, and other technologies. Matthew spreads his knowledge to the data and larger tech communities through meetups in Denver and blog posts on his personal site. He even led a few official Amazon DynamoDB builder’s sessions at re:Invent 2019.

Veliswa Boya – Johannesburg, South Africa

Community Hero Veliswa Boya is a 2x certified AWS Cloud Engineer currently working with application teams on Financial Services cloud migration strategies and cloud architecture designs. She is a member of the Indoni Developers, a platform for African women in coding/tech. She speaks at meetups and was one of the speakers at the inaugural AWS Community Day Cape Town in 2019. Veliswa enjoys speaking and connecting with those who are new to tech and specifically new to AWS. She mentors young people who are looking to embark on AWS certification journeys, shares her own experiences, and gives guidance and support. Veliswa also likes to write about “what she’s learned so far on AWS” and publishes on her Medium blog.

Andrew Brown – Toronto, Canada

Community Hero Andrew Brown is the co-founder of ExamPro, a learning platform designed to help you pass AWS Certification exams. His AWS Certifications video courses are published for free with no ads on freeCodeCamp so that cloud knowledge is accessible to everyone. Andrew volunteers his time mentoring those looking to switch or start a career in the cloud industry. All you need to do is reach out and send him a message on LinkedIn. He’s also the AWS moderator and a top author for DEV. Andrew is active in the Toronto developer community, and you can meet him at AWS Toronto User Group events.

Kyuhyun Byun – Seoul, Korea

Serverless Hero Kyuhyun Byun is a leader of the AWSKRUG Serverless Group and the CircleCI Korea User Group. He is a Software Engineer at Danggeun Market and was previously CTO at Movilest. He is interested in serverless architecture using AWS Lambda and AWS Glue, and enjoys building real-time services and data pipelines with the Go language. He is a serverless specialist who gives talks at various conferences, user groups, and hands-on labs.

Elliott Cordo – Berkeley Heights, USA

Data Hero Elliott Cordo is a data engineering, data warehouse, information management, and technology innovation expert with a passion for helping transform data into powerful information. Elliott has built nearly a dozen cloud-native data platforms on AWS, ranging from data warehouses and data lakes to real-time activation platforms in companies ranging from small startups to large enterprises. In his current role, Elliott has built a complete data infrastructure leveraging AWS at Equinox Fitness, and most recently Equinox Media. These solutions have resulted in Equinox releasing open source tooling for AWS native data platforms, and led to publications in AWS and Equinox tech blogs and presentations at AWS re:Invent.

Sandip Das – Kolkata, India

Container Hero Sandip Das works as a Sr. Cloud Solutions Architect & DevOps Engineer for Gryphon Online Safety Inc. and a few other companies, where he is focused on developing cutting-edge solutions using AWS. He develops, deploys, and manages containerized solutions on a daily basis using the popular AWS container services ECS, EKS, and Fargate. Sandip finds blogging a great way to share knowledge: he writes articles on LinkedIn about AWS, Docker, Kubernetes, programming, and more. He also creates video tutorials on his YouTube channel.

Rustem Feyzkhanov – San Jose, USA

Machine Learning Hero Rustem Feyzkhanov is a Machine Learning Engineer at Instrumental, where he creates analytical models for the manufacturing industry. He is passionate about the use of cloud infrastructure for AI/ML applications and is the author of the online courses “Practical Deep Learning on the Cloud” and “Serverless Deep Learning with TensorFlow and AWS Lambda.” He is also the creator of a few popular open-source repositories on GitHub about using AWS infrastructure for deep learning applications.

Hiromi Ito – Osaka, Japan

Community Hero Hiromi Ito is a Customer Marketing Manager at DigitalCube Co., Ltd. Since joining the Japanese AWS user group (JAWS-UG) in 2014, she has been actively involved in the creation of a women’s group in Japan, the overall running of JAWS-UG, and the community work of regional groups. In 2018, she was an organizer of JAWS DAYS 2018 and was named an AWS Samurai 2017. In 2019, she created the AWS Asian Women’s Association (Global Community) to host events and online meetups in the Asia region. She was selected as an AWS re:Invent Community Leader Diversity Grant recipient in 2019, and continues to expand AWS community activities to more of the world than ever before.

Zamira Jaupaj – Amsterdam, The Netherlands

Community Hero Zamira Jaupaj is a Solution Architect at Mobiquity, implementing AWS solutions and helping customers with their digital transformation. She has more than 6 years of experience implementing critical and complex AWS solutions with containers, serverless, and data analytics for small and enterprise companies. Zamira is the founder of AWS Meetup Albania and co-organizer of AWS Meetup Netherlands, coordinating several meetups with international speakers on a variety of topics. She also regularly speaks at technical conferences and authors tech blogs, sharing best practices about her AWS experiences on Medium.

Heewon Jeon – Seoul, Korea

Machine Learning Hero Heewon Jeon is an applied scientist at SK Telecom. He enjoys developing NLP open source projects as a hobby, and one of his activities is contributing to GluonNLP as a member of the Distributed (Deep) Machine Learning Community, or DMLC. He likes to use MXNet as his main deep learning platform because of the efficiency of training. Recently, he successfully trained Korean GPT2 (KoGPT2) with hundreds of millions of sentences on multiple machines in partnership with AWS internal teams. He is also an author on the MXNet and AWS Korea blogs, and has written numerous articles on model training and distribution.

Hyunmin Kim – Seoul, Korea

Community Hero Hyunmin Kim is a manager in Megazone Cloud’s Commercial Technology Center Solutions Architect team. Over the past three years, he has been working with many customers to develop experience with AWS. The AWS community in Gangnam is growing rapidly and is learning Docker, Kubernetes, and container orchestration services such as ECS and EKS together. Hyunmin helps organize the AWSKRUG Gangnam group as well as the AWSKRUG Container Group, where he frequently presents on various topics.

Pascal Martin – Lyon, France

Container Hero Pascal Martin is DevOps Lead at Bedrock, where he has helped move an entire video streaming platform to AWS, running applications in containers on Kubernetes. He now focuses on scalability, resiliency, and cost efficiency, still leveraging Kubernetes and its ecosystem, managed services, and serverless. He loves sharing his knowledge and experience and sometimes writes on his blog. Over the past few years, he has spoken about resiliency, Kubernetes, and the cloud at several meetups and conferences, including AWS Summit Paris, MixIT Lyon, and Forum PHP.

Kohei Matsushita – Tokyo, Japan

IoT Hero Kohei Matsushita is a technology evangelist at SORACOM. He delivers over 140 seminars and training sessions throughout Japan each year, and also publishes videos, blogs, and books on IoT technology, which are widely referenced in the IoT industry. From low-power wireless devices such as Raspberry Pi and the SORACOM LTE-M Button, to cloud integration with AWS IoT Core and other managed services, he is deeply versed in a wide variety of IoT architectures. He also actively participates in the Japan AWS (JAWS) User Group.

Serkan Özal – Istanbul, Turkey

Serverless Hero Serkan Özal is the CTO and founder of Thundra, a serverless-centric application debugging, monitoring, and security solution. He mainly works on serverless architectures, distributed systems, and monitoring tools. For years, Serkan has published some of his work as open source tools and libraries on his GitHub account for others to use and contribute to. He regularly writes technical blog posts both on his Medium account and on the Thundra blog. In addition to his responsibilities at Thundra, he speaks at international conferences and moderates serverless workshops.

Jayaraman Palaniappan – Orange County, USA

Data Hero Jayaraman Palaniappan is the Head of Cloud Practice at Agilisium, focusing on AWS Big Data Technologies. For the past 7 years, he has mainly been involved in building Data Analytics Solutions using AWS Services for customers. Jayaraman helps conduct webinars, immersion days, and community days to evangelize Data Analytics & Big Data Services (EMR, Redshift, Kinesis, S3, & Glue) both within his organization and outside.

Marcelo Palladino – São Paulo, Brazil

Community Hero Marcelo Palladino is a Senior Software Engineer at Hi Platform, delivering cloud-based systems that help millions of customers every month. He has more than two decades of IT experience and holds six AWS certifications. Marcelo is a co-organizer of the AWS User Group São Paulo and AWS Community Day Brazil, and a public speaker. He firmly believes that knowledge-sharing has a real impact within the community and is a great way to learn new things, help society, and help his country.

Rajarajan Pudupatti – Newport, USA

Container Hero Rajarajan Pudupatti is a Director of Cloud Platform Architecture at Fidelity Investments, where he drives the engineering behind building next-gen, model-based, cloud-native platforms on Amazon EKS for running mission-critical enterprise production workloads. Rajarajan is a GitOps enthusiast, and last year he spoke at KubeCon 2019 on building enterprise-grade cloud platforms on AWS. He has also played an instrumental role in helping open source projects such as eksctl and the AWS ingress controller meet enterprise standards and become production grade.

Cosmin Sanda – Copenhagen, Denmark

Machine Learning Hero Cosmin Sanda combines data engineering with data science to deliver end-to-end products that are scalable and resilient. He designs and implements both batch and real-time Big Data pipelines that transform and enhance data assets. Cosmin is adding value to the ML community by writing tutorials that explain best practices, data manipulations, and steps required to deliver real-life working applications. He also contributes to open source, provides support, and runs the Copenhagen Apache MXNet meetup group.

Bruce Sun – Hangzhou, China

Community Hero Bruce Sun is the hybrid cloud team leader at NetEase Games. Bruce dived deep into many AWS network services, such as VPC, Direct Connect, and Global Accelerator, to design the complex hybrid network architecture that serves their global gaming services. He also took the lead on performance benchmark testing of the AWS Nitro System and Arm-based AWS Graviton processors, which helped them innovate faster in a cost-effective way. Bruce presented their EC2 A1 Graviton use case at AWS re:Invent 2019. He also participated in AWS Game Tech Day events in China to share best practices on AWS.

Amy Tseng – Washington D.C., USA

Data Hero Amy Tseng is a data engineering manager at Fannie Mae, specializing in data warehousing and data analytics. She presents at local meet-ups for Women in Technology to encourage more women to enter big data technology. She presented a session on Data Warehouse Migration at AWS re:Invent 2019 and a session on Implementing Hybrid Data Warehouses at the AWS Public Sector Summit in 2019. Amy is passionate about exploring new technologies and works closely with AWS product teams to keep exploring new features and enhancements for the services she uses. She encourages her team to think outside of the box and continue to innovate using these emerging technologies.

Rehan van der Merwe – Pretoria, South Africa

Community Hero Rehan van der Merwe is a developer, architect, and AWS junkie at heart who consumes an unhealthy amount of coffee while focusing on serverless and all that AWS has to offer. He organizes the AWS PTA Meetup and gives the occasional presentation as well. He is an avid blogger and helps the AWS community where possible, always lurking in the #aws Slack channels and answering questions. He currently holds three AWS certifications and is passionate about serverless and about architecting big data and microservices solutions.

Artem Yushev – Munich, Germany

IoT Hero Artem Yushev is a Staff Application Engineer in Infineon’s Digital Security Solutions division. In his role he evangelizes open source software within his company and outside it. He is passionate about embedded security in general and its application to IoT in particular. His contributions focus on FreeRTOS and on using hardware security with FreeRTOS and AWS IoT, making security practices easy to understand for a broad audience and, in turn, a requirement for any IoT device.

Learn more about the newest AWS Heroes and connect with a Hero near you by visiting the AWS Hero website.

Ross;
