Wednesday, July 15, 2015

Java Performance issue compared to Perl

I have written a Perl script to process a large number of CSV files and produce a summary, which takes 0.8326 ms to complete.

my $opname = $ARGV[0];
my @files = `find . -name "*${opname}*.csv" -mtime -10 -type f`;
my %hash;
foreach my $file (@files) {
    chomp $file;
    my $time = $file;
    $time =~ s/.*\~(.*?)\..*/$1/;

    open(IN, $file) or print "Can't open $file\n";
    while (<IN>) {
        my $line = $_;
        chomp $line;

        my $severity = (split(",", $line))[6];
        next if $severity =~ m/NORMAL/i;
        $hash{$time}{$severity}++;
    }
    close(IN);
}
foreach my $time (sort {$b <=> $a} keys %hash) {
    foreach my $severity ( keys %{$hash{$time}} ) {
        print $time . ',' . $severity . ',' . $hash{$time}{$severity} . "\n";
    }
}

Now I have written the same logic in Java, but it takes 2.6 ms to complete. My question is: why is Java taking so much longer, and how can I achieve the same speed as Perl? Note: I ignored the VM initialization and class-loading time.

    import java.io.BufferedReader;
    import java.io.File;
    import java.io.FileFilter;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.TreeMap;

    public class MonitoringFileReader {
        static Map<String, Map<String,Integer>> store= new TreeMap<String, Map<String,Integer>>(); 
        static String opname;
        public static void testRead(String filepath) throws IOException
        {
            File file = new File(filepath);

            FileFilter fileFilter = new FileFilter() {

                @Override
                public boolean accept(File pathname) {
                    // 86400000 ms = 1 day, so this is the file age in days
                    // (matching the Perl find's -mtime -10)
                    int timediffindays = (int) ((System.currentTimeMillis() - pathname.lastModified()) / 86400000);
                    return timediffindays < 10
                            && pathname.getName().endsWith(".csv")
                            && pathname.getName().contains(opname);
                }
            };

            File[] listoffiles= file.listFiles(fileFilter);
            long time = System.currentTimeMillis();
            for(File mf:listoffiles){
                String timestamp=mf.getName().split("~")[5].replace(".csv", "");
                BufferedReader br= new BufferedReader(new FileReader(mf),1024*500);
                String line;
                Map<String,Integer> tmp=store.containsKey(timestamp)?store.get(timestamp):new HashMap<String, Integer>();
                while((line=br.readLine())!=null)
                {
                    String severity = line.split(",")[6];
                    // match the Perl /NORMAL/i test case-insensitively
                    if (!severity.equalsIgnoreCase("NORMAL"))
                    {
                        tmp.put(severity, tmp.containsKey(severity) ? tmp.get(severity) + 1 : 1);
                    }
                }
            store.put(timestamp, tmp);
            }
            time = System.currentTimeMillis() - time;
            System.out.println(time+"ms");  
            System.out.println(store);


        }

        public static void main(String[] args) throws IOException
        {
            opname = args[0];
            long time= System.currentTimeMillis();
            testRead("./SMF/data/analyser/archive");
            time=System.currentTimeMillis()-time;
            System.out.println(time+"ms");
        }

    }
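One common hotspot in loops like the one above is `line.split(",")`, which allocates an array for every field on every line even though only the seventh field is needed. A minimal sketch of extracting a single comma-separated field by scanning with `indexOf` instead (the class and method names here are illustrative, not part of the original code):

```java
public class SeverityFieldDemo {

    // Return the field at the given zero-based index of a
    // comma-separated line, without regex or array allocation.
    static String field(String line, int index) {
        int start = 0;
        for (int i = 0; i < index; i++) {
            // advance past the next comma for each field skipped
            start = line.indexOf(',', start) + 1;
        }
        int end = line.indexOf(',', start);
        return end < 0 ? line.substring(start) : line.substring(start, end);
    }

    public static void main(String[] args) {
        String line = "A,B,C,D,E,F,CRITICAL,G";
        System.out.println(field(line, 6)); // prints CRITICAL
    }
}
```

In the reader loop, `field(line, 6)` would stand in for `line.split(",")[6]`; whether this closes the gap with Perl would need measuring on the real 500-file workload.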

File input format (A~B~C~D~E~20150715080000.csv), around 500 files of ~1 MB each:

A,B,C,D,E,F,CRITICAL,G
A,B,C,D,E,F,NORMAL,G
A,B,C,D,E,F,INFO,G
A,B,C,D,E,F,MEDIUM,G
A,B,C,D,E,F,CRITICAL,G

Java version: 1.7. Thanks!
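A separate caveat: even with VM startup and class loading excluded, a single timed run of a few milliseconds is largely measuring JIT warm-up rather than steady-state speed, so the first run is not representative. A hedged sketch of timing repeated runs to see the warm-up effect (the `work` method is a toy stand-in for the CSV parse, not the real parser):

```java
public class WarmupDemo {

    // Toy workload standing in for the CSV parse (illustrative only).
    static long work() {
        long acc = 0;
        for (int i = 0; i < 100000; i++) {
            acc += "A,B,C,D,E,F,CRITICAL,G".indexOf(',', i % 8);
        }
        return acc;
    }

    public static void main(String[] args) {
        // Early iterations include JIT compilation cost;
        // later ones reflect steady-state speed.
        for (int run = 0; run < 5; run++) {
            long t0 = System.nanoTime();
            work();
            long t1 = System.nanoTime();
            System.out.println("run " + run + ": " + (t1 - t0) / 1000000.0 + " ms");
        }
    }
}
```

For a fair Perl-vs-Java comparison, the Java loop should be timed after several warm-up iterations (or with a harness built for that purpose) rather than on its first execution.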
