Blockchain shows open source’s fatal flaw—and a way forward

“26,000 new blockchain projects last year!” screamed the headline. “But only 8 percent remain active!” The implication is that blockchain’s future is at risk, given the high mortality rate among its offspring. Yet nothing could be further from the truth. If anything, we need many more blockchain projects to fail to clear out some of the noise, leaving room for “Linux of blockchain”-type projects to take hold.

And yet there is cause for concern, though not in blockchain specifically. Instead, the greater concern should be for open source, which has never been more popular with software users even as the developer population feeding it has remained flat. Unless we can find ways to encourage more contributions, open source efforts like blockchain threaten to crumble under the weight of user expectations unmet by developer productivity.

What’s new in TensorFlow machine learning

TensorFlow, Google’s contribution to the world of machine learning and data science, is a general framework for quickly developing neural networks. Despite being relatively new, TensorFlow has already found wide adoption as a common platform for such work, thanks to its powerful abstractions and ease of use.

TensorFlow 1.4 API additions

TensorFlow Keras API

The biggest changes in TensorFlow 1.4 involve two key additions to the core TensorFlow API. The first, the tf.keras API, lets users employ the Keras API, a high-level neural network library that predates TensorFlow but is quickly being displaced by it. It allows software written against Keras to be transitioned to TensorFlow, either by keeping the Keras interface permanently or as a prelude to reworking the software to use TensorFlow natively.

What is Apache Spark? The big data analytics platform explained

From its humble beginnings in the AMPLab at U.C. Berkeley in 2009, Apache Spark has become one of the key big data distributed processing frameworks in the world. Spark can be deployed in a variety of ways, provides native bindings for the Java, Scala, Python, and R programming languages, and supports SQL, streaming data, machine learning, and graph processing. You’ll find it used by banks, telecommunications companies, game companies, governments, and major tech giants such as Apple, Facebook, IBM, and Microsoft.

Out of the box, Spark can run in a standalone cluster mode that simply requires the Apache Spark framework and a JVM on each machine in your cluster. However, it’s more likely you’ll want to rely on a resource or cluster management system to allocate workers on demand for you. In the enterprise, this normally means running on Hadoop YARN (this is how the Cloudera and Hortonworks distributions run Spark jobs), but Apache Spark can also run on Apache Mesos, and work is progressing on adding native support for Kubernetes.
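To make that concrete, here is a minimal word-count sketch against Spark’s Java bindings. The input path, application name, and master setting are placeholder assumptions; in a real deployment the master URL (a YARN, Mesos, or standalone-cluster address) is typically supplied by spark-submit or the cluster manager rather than hard-coded.

```java
import java.util.Arrays;

import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

import scala.Tuple2;

public class WordCountSketch {
    public static void main(String[] args) {
        // Build a session; "local[*]" runs in-process for testing, while a
        // standalone cluster would use a URL such as spark://host:7077.
        SparkSession spark = SparkSession.builder()
                .appName("WordCountSketch")
                .master("local[*]")
                .getOrCreate();

        JavaSparkContext sc = new JavaSparkContext(spark.sparkContext());

        // Classic word count: split lines into words, pair each word with 1,
        // then sum the counts per word across the cluster.
        JavaRDD<String> lines = sc.textFile("input.txt");   // placeholder path
        JavaPairRDD<String, Integer> counts = lines
                .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                .mapToPair(word -> new Tuple2<>(word, 1))
                .reduceByKey(Integer::sum);

        counts.take(20).forEach(pair ->
                System.out.println(pair._1() + ": " + pair._2()));

        spark.stop();
    }
}
```

The same jar could be handed to spark-submit, which ships the code to whatever workers YARN, Mesos, or a standalone cluster allocates.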

Kotlin frameworks: A survey of JVM development tools

Kotlin, the statically typed alternative to Java for JVM and Android development, is showing signs of making it into the big leagues, getting support in development frameworks.

For example, the Spring Framework now supports Kotlin. And there are new frameworks such as Javalin and Ktor that support Kotlin.

Kotlin frameworks: Support in the Spring Framework

Best known as a venerable Java framework featuring dependency injection, Pivotal’s Spring Framework gained Kotlin support in the Spring 5.0 version released this year. Spring 5.0 supports Kotlin extensions, which let existing classes be extended with new functionality, offering an alternative to utility classes or Kotlin-specific class hierarchies and making it possible to add Kotlin-friendly features to Spring’s own APIs.

21 plug-ins to make the most of Eclipse

Eclipse continues to be one of the most popular developer IDEs, thanks in large part to the broad ecosystem of plug-ins the platform supports. It may have begun as a tool for Java, but more and more people use it for other languages and frameworks, from Scala and Kotlin to JavaScript and Node.js. 

Beta JetBrains IDE moves Kotlin apps out of the JVM

JetBrains has made available its Kotlin/Native technology, which compiles Kotlin code to native binaries that run without a Java virtual machine. A beta version of the CLion IDE allows Kotlin programs to be compiled directly to an executable machine-code format.

Kotlin is a statically typed alternative to Java that began on the JVM. But many platforms can’t run a JVM, which has restricted Kotlin to JVM-friendly platforms like Android. The Kotlin/Native preview’s supported target platforms include MacOS, iOS, Ubuntu Linux, and Raspberry Pi.

Java 101: Datastructures and algorithms in Java, Part 2

An array is a fundamental datastructure category, and a building block for more complex datastructures. In this second part of my Java 101 introduction to datastructures and algorithms, you will learn how arrays are understood and used in Java programming. I introduce the concept of an array and how arrays are represented in the Java language. Then you’ll learn about one-dimensional arrays and the three ways that you can introduce them to your Java programs. Finally, we’ll explore five algorithms used to search and sort one-dimensional arrays.

Note that this article builds on Datastructures and algorithms, Part 1, which introduces the theoretical side of datastructures and the algorithms associated with them. That article includes an in-depth discussion of algorithms and how to use space and time complexity factors to evaluate and select the most efficient algorithm for your Java program. This article will be much more hands-on, and assumes you have already read and digested Part 1.
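As a taste of the hands-on material, here is a small, self-contained sketch of the ground Part 2 covers: a few common ways to create a one-dimensional array in Java, plus a linear search. It is an illustrative example rather than the article’s own listings, and the specific three declaration styles and five algorithms the article walks through may differ.

```java
public class ArrayBasics {
    public static void main(String[] args) {
        // Declare and initialize in one step with an array initializer.
        int[] primes = { 2, 3, 5, 7, 11 };

        // Allocate with new, then fill the elements later.
        int[] squares = new int[5];
        for (int i = 0; i < squares.length; i++) {
            squares[i] = i * i;
        }

        // Combine new with an initializer (handy for passing an array inline).
        printAll(new int[] { 1, 1, 2, 3, 5, 8 });

        System.out.println("Index of 7 in primes: " + linearSearch(primes, 7));
    }

    // Linear search: the simplest search algorithm for an unsorted array.
    static int linearSearch(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == key) {
                return i;
            }
        }
        return -1; // not found
    }

    static void printAll(int[] a) {
        for (int value : a) {
            System.out.print(value + " ");
        }
        System.out.println();
    }
}
```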

ZGC large-heap Java garbage collector may go open source

An Oracle-developed, low-latency Java garbage collector geared to large heaps could move to the open source community, if a proposal to do so gets community approval. Votes are due by November 8.

Called the Z Garbage Collector (ZGC), the project is designed to support multiterabyte heaps, keep pause times below 10 milliseconds, and limit the reduction in application throughput to no more than 15 percent compared to the G1 garbage collector.

But ZGC’s developers don’t see these goals as “hard requirements” for every workload, according to a proposal floated on an OpenJDK mailing list by Per Liden, a member of the HotSpot virtual machine team at Oracle. Liden’s proposal calls for creation of a ZGC project that he would lead, with the HotSpot group as sponsor. 

Machine learning for Java developers

Self-driving cars, face detection software, and voice-controlled speakers are all built on machine learning technologies and frameworks, and these are just the first wave. Over the next decade, a new generation of products will transform our world, initiating new approaches to software development and to the applications and products that we create and use.

As a Java developer, you want to get ahead of this curve now, while tech companies are beginning to invest seriously in machine learning. What you learn today, you can build on over the next five years, but you have to start somewhere.

This article will get you started. You will begin with a first impression of how machine learning works, followed by a short guide to implementing and training a machine learning algorithm. After studying the internals of the learning algorithm and the features you can use to train, score, and select the best-fitting prediction function, you’ll get an overview of using the JVM framework Weka to build machine learning solutions. This article focuses on supervised machine learning, which is the most common approach to developing intelligent applications.
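To ground that roadmap, here is a minimal sketch of the train-and-score workflow using Weka. The ARFF file name and the J48 decision-tree learner are placeholder assumptions for illustration (the data set is assumed to have a nominal class attribute in its last column); the article’s own walkthrough may use different data and a different learning algorithm.

```java
import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class WekaSketch {
    public static void main(String[] args) throws Exception {
        // Load a training set in Weka's ARFF format (placeholder file name).
        Instances data = DataSource.read("training-data.arff");
        data.setClassIndex(data.numAttributes() - 1); // last attribute is the label

        // Train (fit) a prediction function; J48 is a decision-tree learner bundled with Weka.
        Classifier model = new J48();
        model.buildClassifier(data);

        // Score the learned function with 10-fold cross-validation to estimate
        // how well it generalizes to data it has not seen.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(model, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```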

Serverless computing with AWS Lambda

Serverless computing may be the hottest thing in cloud computing today, but what, exactly, is it? In this two-part article you’ll get started with serverless computing: what it is, why it’s considered disruptive to traditional cloud computing, and how you might find yourself using it in Java-based programming. Following the overview, you’ll get a tutorial introduction to AWS Lambda, which many consider the premier Java-based solution for serverless computing today. In Part 1, you’ll use AWS Lambda to build your first serverless function in Java. In Part 2, you’ll integrate your Lambda functions with DynamoDB, then use the AWS SDK to invoke Lambda functions in a Java application.
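As a preview of what such a function looks like, here is a minimal sketch of a Java handler built on the aws-lambda-java-core library’s RequestHandler interface. The class name and the String input and output types are illustrative assumptions; packaging, deployment settings, and the DynamoDB integration covered in Part 2 are omitted.

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// A minimal Lambda handler: AWS deserializes each incoming event into the
// declared input type (here, a String) and calls handleRequest with it.
public class GreetingHandler implements RequestHandler<String, String> {

    @Override
    public String handleRequest(String name, Context context) {
        // The Context object exposes runtime information and a logger
        // that writes to CloudWatch Logs.
        context.getLogger().log("Invoked with: " + name);
        return "Hello, " + name;
    }
}
```

At deployment time you point the function’s handler setting at this class, and AWS Lambda invokes handleRequest for each event, so no server provisioning or request-routing code appears in the application itself.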
