Posts

Showing posts with the label Java

All advanced sorting techniques using Java 8 and Streams

package com.dpq.interview.Q;

import java.math.BigInteger;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.Date;
import java.util.List;

public class SortingDemo {

    public static void main(String[] args) throws ParseException {
        allSortingTechniquesAfterJava8();
    }

    public static void allSortingTechniquesAfterJava8() throws ParseException {
        List<Employee> list = populateList();
        System.out.println("####################### Natural sorting by Employee Id #######################");
        list.stream().sorted().forEach(e -> System.out.println(e));
        System.out.println("####################### Natural sorting by Employee Id but in descending order #######################");
        list.stream().sorted(Collections.reverseOrder()).forEach(e -> System.out.println(e));
        List<Employee> sortedList = list
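The excerpt above is cut off before the comparator-based variants. As a minimal, self-contained sketch of the same Java 8 techniques (the Person class and sample data below are illustrative assumptions, not the post's Employee class), sorting by a field, in descending order, and by chained comparators looks like this:

import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class StreamSortingSketch {

    // Illustrative stand-in for the post's Employee class
    static class Person {
        final int id;
        final String name;
        final double salary;

        Person(int id, String name, double salary) {
            this.id = id;
            this.name = name;
            this.salary = salary;
        }

        int getId() { return id; }
        String getName() { return name; }
        double getSalary() { return salary; }

        @Override
        public String toString() { return id + ":" + name + ":" + salary; }
    }

    public static void main(String[] args) {
        List<Person> people = Arrays.asList(
                new Person(3, "Deepak", 75000),
                new Person(1, "Kuhu", 82000),
                new Person(2, "Garv", 75000));

        // Sort by a single field with Comparator.comparing
        List<Person> byName = people.stream()
                .sorted(Comparator.comparing(Person::getName))
                .collect(Collectors.toList());

        // Descending order by reversing the comparator
        List<Person> bySalaryDesc = people.stream()
                .sorted(Comparator.comparingDouble(Person::getSalary).reversed())
                .collect(Collectors.toList());

        // Chain comparators: salary first, then id to break ties
        List<Person> bySalaryThenId = people.stream()
                .sorted(Comparator.comparingDouble(Person::getSalary)
                        .thenComparingInt(Person::getId))
                .collect(Collectors.toList());

        System.out.println(byName);
        System.out.println(bySalaryDesc);
        System.out.println(bySalaryThenId);
    }
}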

Java 8 - complex programs asked in interviews

package com.dpq.movie.ratings.datalayer.dao.impl;

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class RatingDaoImpl {

    public static void main(String[] args) {
        List<String> names = Arrays.asList("Deepak", "Deepak", "Kuhu", "Kuhu", "Garv");
        System.out.println(names.stream().collect(Collectors.toMap(k -> k, v -> 1, Integer::sum)));

        List<Student> student = Arrays.asList(
                new RatingDaoImpl().new Student("Math", 98),
                new RatingDaoImpl().new Student("Science", 98),
                new RatingDaoImpl().new Student("English", 98),
                new RatingDaoImpl().new Student("Hindi", 98));
        System.out.println(student.stream().map(e -> e.getMarks()).collect(Collectors.summingInt(e -> e.intValue())));

        List<StudentDetails> studentDetails = Arrays.asList(new RatingDaoImpl().new Stud
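For reference, the frequency-count idea in the excerpt (toMap with a merge function) is often asked together with groupingBy/counting. A minimal sketch, reusing the sample names from the excerpt but otherwise illustrative:

import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class InterviewStreamSketch {

    public static void main(String[] args) {
        List<String> names = Arrays.asList("Deepak", "Deepak", "Kuhu", "Kuhu", "Garv");

        // Frequency of each name using groupingBy + counting
        Map<String, Long> frequency = names.stream()
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
        System.out.println(frequency); // Deepak=2, Kuhu=2, Garv=1 (map order not guaranteed)

        // First non-repeated name: a LinkedHashMap keeps encounter order
        String firstUnique = names.stream()
                .collect(Collectors.groupingBy(Function.identity(), LinkedHashMap::new, Collectors.counting()))
                .entrySet().stream()
                .filter(e -> e.getValue() == 1)
                .map(Map.Entry::getKey)
                .findFirst()
                .orElse("none");
        System.out.println(firstUnique); // Garv
    }
}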

Everything about Binary Trees and all their traversal techniques (recursive and iterative) with examples

package org.dpq.ds.tree;

import java.util.Stack;

public class Tree<T> {

    public static void main(String[] args) {
        TreeNode<Integer> root = new TreeNode<Integer>(1);
        root.setLeft(new TreeNode<Integer>(2));
        root.setRight(new TreeNode<Integer>(3));
        root.getLeft().setLeft(new TreeNode<Integer>(4));
        root.getLeft().setRight(new TreeNode<Integer>(5));
        root.getRight().setLeft(new TreeNode<Integer>(6));
        root.getRight().setRight(new TreeNode<Integer>(7));
        Tree<Integer> tree = new Tree<Integer>();
        // Tree
        //        1
        //       / \
        //      2   3
        //     / \ / \
        //    4  5 6  7
        // expected result for inorder (LNR):   4 2 5 1 6 3 7
        // expected result for preorder (NLR):  1 2 4 5 3 6 7
        // expected result for postorder (LRN): 4 5 2 6 7 3 1
        System.out.println("recursive inorder \n");
        tree.inOrder(root);
        System.out.println("recursive pre
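The excerpt stops before the iterative versions the title promises. Here is a minimal sketch of a stack-based (iterative) inorder traversal, assuming a TreeNode<T> with getData(), getLeft() and getRight() accessors like the one used above:

import java.util.Stack;

// Assumes the TreeNode<T> class from the excerpt above, with getData(), getLeft() and getRight()
public class IterativeTraversalSketch {

    public static <T> void iterativeInOrder(TreeNode<T> root) {
        Stack<TreeNode<T>> stack = new Stack<>();
        TreeNode<T> current = root;
        while (current != null || !stack.isEmpty()) {
            // Push the whole left spine onto the stack
            while (current != null) {
                stack.push(current);
                current = current.getLeft();
            }
            // Visit the node, then move into its right subtree
            current = stack.pop();
            System.out.print(current.getData() + " ");
            current = current.getRight();
        }
    }
}

On the tree built in the excerpt this prints 4 2 5 1 6 3 7, matching the recursive inorder result.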

REST API simple application with HTTP methods

We will develop a REST API using JAX-RS (Jersey) on a Tomcat server and implement the four basic HTTP methods, so let's get started. Here is the pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.dpq.webservices</groupId>
  <artifactId>SimpleRestApiApp</artifactId>
  <packaging>war</packaging>
  <version>0.0.1-SNAPSHOT</version>
  <name>SimpleRestApiApp</name>
  <build>
    <finalName>SimpleRestApiApp</finalName>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.1
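The pom.xml excerpt is cut off above. As a hedged sketch of the "four basic HTTP methods" the post refers to (the resource name, path and in-memory store below are illustrative assumptions, not the post's actual code), a minimal Jersey resource could look like this:

package org.dpq.webservices;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.ws.rs.Consumes;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.PUT;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Illustrative resource exposing GET, POST, PUT and DELETE on /messages
@Path("/messages")
public class MessageResource {

    private static final Map<Integer, String> STORE = new ConcurrentHashMap<>();

    @GET
    @Path("/{id}")
    @Produces(MediaType.TEXT_PLAIN)
    public Response get(@PathParam("id") int id) {
        String value = STORE.get(id);
        return value == null
                ? Response.status(Response.Status.NOT_FOUND).build()
                : Response.ok(value).build();
    }

    @POST
    @Path("/{id}")
    @Consumes(MediaType.TEXT_PLAIN)
    public Response create(@PathParam("id") int id, String body) {
        STORE.put(id, body);
        return Response.status(Response.Status.CREATED).build();
    }

    @PUT
    @Path("/{id}")
    @Consumes(MediaType.TEXT_PLAIN)
    public Response update(@PathParam("id") int id, String body) {
        STORE.put(id, body);
        return Response.ok().build();
    }

    @DELETE
    @Path("/{id}")
    public Response delete(@PathParam("id") int id) {
        STORE.remove(id);
        return Response.noContent().build();
    }
}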

REST API understanding with its architecture

REST is an acronym for REpresentational State Transfer and an architectural style for distributed hypermedia systems.

1. Principles of REST:

Uniform Interface: The following four constraints can achieve a uniform REST interface:

Identification of resources – The interface must uniquely identify each resource involved in the interaction between the client and the server.
Manipulation of resources through representations – Resources should have uniform representations in the server response. API consumers should use these representations to modify the resource's state on the server.
Self-descriptive messages – Each resource representation should carry enough information to describe how to process the message. It should also provide information on the additional actions that the client can perform on the resource.
Hypermedia as the engine of application state – The client should have only the initial URI of the application. The client application should dynamically driv
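To make the hypermedia constraint concrete, here is a small hedged sketch (the resource, path and link relations are illustrative, not part of the article): a JAX-RS method that returns a representation carrying links the client can follow, so the client only needs the initial URI.

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Link;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/orders")
public class OrderResource {

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response get(@PathParam("id") String id) {
        String order = "{\"id\": \"" + id + "\", \"status\": \"OPEN\"}";
        // Hypermedia: the response itself tells the client what it can do next
        Link self   = Link.fromUri("/orders/" + id).rel("self").build();
        Link cancel = Link.fromUri("/orders/" + id + "/cancel").rel("cancel").build();
        return Response.ok(order).links(self, cancel).build();
    }
}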

United Kingdom Sponsor Data Analysis with Java Streams and Parallel Streams and comparisons

Things which are covered:
- Reading CSV data using CSVReader
- Checking the record count
- Preparing a list of objects to process
- Evaluating a range check
- Checking whether a given company is registered or not
- Processing the above points with streams and parallel streams
- Comparing times between streams and parallel streams
- Inconsistency in parallel streams

Inconsistency in parallel streams: when searching for an element in the records returned from a parallel stream, no ordering guarantee is given, because the records are split into chunks to be processed in parallel and a comparison may only look at one chunk of the data at a time; that is why results from parallel streams can be inconsistent. How Spark handles this with minimal code and with consistency we will see in an upcoming post. Later we will compare the same analysis with Spark and look at the differences in execution and time complexity.

Implementation:

package com.dpq.sponsors.dr
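As a hedged sketch of the stream vs. parallel-stream timing comparison described above (the generated data and the timing helper are illustrative stand-ins, not the post's sponsor CSV records):

import java.util.List;
import java.util.function.Supplier;
import java.util.stream.Collectors;
import java.util.stream.LongStream;

public class StreamVsParallelSketch {

    public static void main(String[] args) {
        // Stand-in for the sponsor records loaded from CSV
        List<Long> records = LongStream.rangeClosed(1, 5_000_000).boxed()
                .collect(Collectors.toList());

        long sequentialMs = time(() -> records.stream().filter(n -> n % 7 == 0).count());
        long parallelMs   = time(() -> records.parallelStream().filter(n -> n % 7 == 0).count());

        System.out.println("sequential count took " + sequentialMs + " ms");
        System.out.println("parallel   count took " + parallelMs + " ms");
        // Note: for order-sensitive operations (e.g. findFirst vs findAny) the parallel
        // version may behave differently, because the data is processed in chunks.
    }

    private static long time(Supplier<Long> task) {
        long start = System.currentTimeMillis();
        task.get();
        return System.currentTimeMillis() - start;
    }
}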

Covid Data Analysis with Bed availability and other details

Covid Data Analysis:

I have provided a small sample dataset; I also ran the same program with 10 GB of data on a cluster with 10 mappers, and it took around 25 seconds to process the data.

- We have added a Partitioner just to understand how the data is partitioned and how a mapper is assigned to process a particular partition
- Implemented caching as a performance booster
- Country-wise total cases
- Country-wise new cases
- Country-wise other details like available beds, booster details etc.

For more details please follow the Git repository below:
https://github.com/Deepak-Bhardwaj-Architect/CovidDataAnalysis

Implementation:

package com.citi.covid.spark.driver;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class CovidDataAnalysis {

    public static void main(String[] args) throws InterruptedException {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName
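Since the code excerpt above is cut off, here is a hedged sketch of the country-wise aggregation it describes (the CSV path and the column names country, new_cases and total_cases are assumptions about the dataset, not the repository's actual schema):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class CovidAggregationSketch {

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("CovidAggregationSketch")
                .master("local[*]")
                .getOrCreate();

        // Assumed columns: country, new_cases, total_cases
        Dataset<Row> covid = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("covid_sample.csv");

        // Cache because the dataset is reused for several aggregations
        covid.cache();

        // Country-wise total and new cases
        covid.groupBy(col("country"))
                .agg(sum(col("total_cases")).alias("total_cases"),
                     sum(col("new_cases")).alias("new_cases"))
                .show();

        spark.stop();
    }
}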

Model Data Analysis Using DataFrames and Spark SQL in Java

Here we will analyze model data using pure Spark SQL and DataFrames, and we will use the most commonly used methods with sample data.

package com.dpq.model.data.driver;

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ModelDataAnalysis {

    public static void main(String[] args) throws InterruptedException {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("Spark Count").setMaster("local"));
        SparkSession spark = SparkSession.builder().appName("spark-bigquery-demo").getOrCreate();
        Dataset<Row> row = spark.read().csv("/Users/dpq/springbootWrokspace/CountryDataAnalysis/resources/modeloutput.csv");
        // way 1 to change column name
        row = row.withColumnRenamed("_c0", "CountryName");
        row = row.withColumnRenamed("_c1", "ReportingPurpuse");
        row = row.withColumnRenamed("_c2", "Quarter");
        row = row.withColumnR
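The excerpt is truncated above. As a hedged sketch of the "pure Spark SQL" part (reusing the renamed columns CountryName and Quarter from the excerpt, but with an illustrative file path and query that are not the post's actual code):

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ModelDataSqlSketch {

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("ModelDataSqlSketch")
                .master("local[*]")
                .getOrCreate();

        Dataset<Row> model = spark.read()
                .csv("modeloutput.csv")
                .withColumnRenamed("_c0", "CountryName")
                .withColumnRenamed("_c2", "Quarter");

        // Register the DataFrame as a temporary view and query it with plain SQL
        model.createOrReplaceTempView("model_output");
        Dataset<Row> perQuarter = spark.sql(
                "SELECT CountryName, Quarter, COUNT(*) AS rows_per_quarter "
                        + "FROM model_output GROUP BY CountryName, Quarter");
        perQuarter.show();

        spark.stop();
    }
}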

Country Risk Data Analysis by Spark with Dataset

Here we will analyse country risk data and do some manipulation on it; to do that manipulation we will apply fx-rate data to this dataset, and to achieve performance benefits we will use a broadcast variable.

Sample Data:
AU,,2017Q1,Account,100.1020,2000.1040
KR,,2017Q1,Account,100.1020,2000.1040
US,,2017Q1,Account,100.1020,2000.1040
AU,,2018Q1,Account,100.1020,2000.1040
US,,2018Q1,Account,100.1020,2000.1040
AU,,2019Q1,Account,100.1020,2000.1040
KR,,2019Q1,Account,100.1020,2000.1040
AU,,2016Q1,Account,100.1020,2000.1040
KR,,2016Q1,Account,100.1020,2000.1040
AU,,2017Q1,Segment,100.1020,2000.1040
AU,,2017Q1,Segment,100.1020,2000.1040
US,,2017Q1,Account,100.1020,2000.1040

package com.dpq.country.data.driver;

import java.io.Serializable;
import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Iterator;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

import org.apache.spark.SparkConf;
import or
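The implementation above is cut off after the imports. Here is a hedged sketch of the broadcast-variable idea the post describes (the fx-rate values and column positions are assumptions based on the sample data, not the post's actual code): the small fx-rate map is broadcast once to every executor and applied to each record.

import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.broadcast.Broadcast;

public class CountryRiskBroadcastSketch {

    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("CountryRiskBroadcastSketch").setMaster("local[*]"));

        // Small fx-rate lookup table, broadcast once to all executors (rates are illustrative)
        Map<String, Double> fxRates = new HashMap<>();
        fxRates.put("AU", 0.65);
        fxRates.put("KR", 0.00072);
        fxRates.put("US", 1.0);
        Broadcast<Map<String, Double>> fxBroadcast = sc.broadcast(fxRates);

        // Records in the same shape as the sample data above
        JavaRDD<String> lines = sc.parallelize(Arrays.asList(
                "AU,,2017Q1,Account,100.1020,2000.1040",
                "KR,,2017Q1,Account,100.1020,2000.1040",
                "US,,2017Q1,Account,100.1020,2000.1040"));

        // Convert the two amount columns using the broadcast fx rate for the record's country
        List<String> converted = lines.map(line -> {
            String[] parts = line.split(",", -1);
            double rate = fxBroadcast.value().getOrDefault(parts[0], 1.0);
            double amount1 = Double.parseDouble(parts[4]) * rate;
            double amount2 = Double.parseDouble(parts[5]) * rate;
            return parts[0] + "," + parts[2] + "," + parts[3] + "," + amount1 + "," + amount2;
        }).collect();

        converted.forEach(System.out::println);
        sc.close();
    }
}

Broadcasting the lookup table avoids shipping the map with every task, which is the performance benefit the post mentions.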