Examples of runJob()


Examples of org.apache.mahout.classifier.bayes.mapreduce.common.BayesWeightSummerDriver.runJob()

   
    log.info("Calculating weight sums for labels and features...");
    // Calculate the sums of weights for each label, for each feature,
    // and for each (label, feature) pair
    BayesWeightSummerDriver summer = new BayesWeightSummerDriver();
    summer.runJob(input, output, params);
   
    log.info("Calculating the weight Normalisation factor for each class...");
    // Calculate the normalization factor Sigma_W_ij for each class.
    BayesThetaNormalizerDriver normalizer = new BayesThetaNormalizerDriver();
    normalizer.runJob(input, output, params);

Examples of org.apache.mahout.classifier.bayes.mapreduce.common.BayesWeightSummerDriver.runJob()

   
    log.info("Calculating weight sums for labels and features...");
    // Calculate the sums of weights for each label, for each feature,
    // and for each (label, feature) pair
    BayesWeightSummerDriver summer = new BayesWeightSummerDriver();
    summer.runJob(input, output, params);
   
    log.info("Calculating the weight Normalisation factor for each complement class...");
    // Calculate the normalization factor Sigma_W_ij for each complement class.
    CBayesThetaNormalizerDriver normalizer = new CBayesThetaNormalizerDriver();
    normalizer.runJob(input, output, params);

Examples of org.apache.mahout.classifier.bayes.mapreduce.common.BayesWeightSummerDriver.runJob()

    tfidf.runJob(input, output, params);

    log.info("Calculating weight sums for labels and features...");
    // Calculate the sums of weights for each label, for each feature, and for each (label, feature) pair
    BayesWeightSummerDriver summer = new BayesWeightSummerDriver();
    summer.runJob(input, output, params);

    log.info("Calculating the weight Normalisation factor for each class...");
    // Calculate the normalization factor Sigma_W_ij for each class.
    BayesThetaNormalizerDriver normalizer = new BayesThetaNormalizerDriver();
    normalizer.runJob(input, output, params);
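All three snippets above chain the same map-reduce stages: BayesWeightSummerDriver.runJob(input, output, params) sums the weights, and a theta-normalizer driver then computes the per-class normalization factor, with BayesThetaNormalizerDriver used for standard naive Bayes and CBayesThetaNormalizerDriver for the complementary variant. A minimal sketch of that chain follows; the Path-based argument types, the package locations of BayesParameters and the normalizer driver, and the wrapper method itself are assumptions (these shifted between early Mahout releases) and are not taken from the excerpts above.

    // Hedged sketch: chain the weight-summing and theta-normalization stages.
    // Assumed: Path-based runJob(input, output, params) and the package
    // locations below, roughly as in Mahout 0.4-era trunk.
    import java.io.IOException;

    import org.apache.hadoop.fs.Path;
    import org.apache.mahout.classifier.bayes.common.BayesParameters;
    import org.apache.mahout.classifier.bayes.mapreduce.bayes.BayesThetaNormalizerDriver;
    import org.apache.mahout.classifier.bayes.mapreduce.common.BayesWeightSummerDriver;

    public class BayesTrainingSketch {

      public static void sumAndNormalize(Path input, Path output, BayesParameters params)
          throws IOException {
        // Sum the weights per label, per feature, and per (label, feature) pair.
        BayesWeightSummerDriver summer = new BayesWeightSummerDriver();
        summer.runJob(input, output, params);

        // Compute the Sigma_W_ij normalization factor for each class.
        BayesThetaNormalizerDriver normalizer = new BayesThetaNormalizerDriver();
        normalizer.runJob(input, output, params);
      }
    }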

Examples of org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.runJob()

    int overshoot = (int) ((double) clusters * OVERSHOOT_MULTIPLIER);
    List<Double> eigenValues = new ArrayList<Double>(overshoot);
    Matrix eigenVectors = new DenseMatrix(overshoot, numDims);
    DistributedLanczosSolver solver = new DistributedLanczosSolver();
    Path lanczosSeqFiles = new Path(outputCalc, "eigenvectors-" + (System.nanoTime() & 0xFF));
    solver.runJob(conf,
                  L.getRowPath(),
                  new Path(outputTmp, "lanczos-" + (System.nanoTime() & 0xFF)),
                  L.numRows(),
                  L.numCols(),
                  true,

Examples of org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.runJob()

                                                               int overshoot,
                                                               List<Double> eigenValues,
                                                               Matrix eigenVectors, Path tmp) throws IOException {
    DistributedLanczosSolver solver = new DistributedLanczosSolver();
    Path seqFiles = new Path(tmp, "eigendecomp-" + (System.nanoTime() & 0xFF));
    solver.runJob(conf,
                  input.getRowPath(),
                  new Path(tmp, "lanczos-" + (System.nanoTime() & 0xFF)),
                  input.numRows(),
                  input.numCols(),
                  true,

Examples of org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.runJob()

                                                               int numEigenVectors,
                                                               int overshoot,
                                                               Path tmp) throws IOException {
    DistributedLanczosSolver solver = new DistributedLanczosSolver();
    Path seqFiles = new Path(tmp, "eigendecomp-" + (System.nanoTime() & 0xFF));
    solver.runJob(conf,
                  state,
                  overshoot,
                  true,
                  seqFiles.toString());

Examples of org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.runJob()

    // unnecessary vectors later
    int overshoot = (int) ((double) clusters * OVERSHOOT_MULTIPLIER);
    DistributedLanczosSolver solver = new DistributedLanczosSolver();
    LanczosState state = new LanczosState(L, numDims, solver.getInitialVector(L));
    Path lanczosSeqFiles = new Path(outputCalc, "eigenvectors-" + (System.nanoTime() & 0xFF));
    solver.runJob(conf,
                  state,
                  overshoot,
                  true,
                  lanczosSeqFiles.toString());


Examples of org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.runJob()

      int overshoot = Math.min((int) (clusters * OVERSHOOTMULTIPLIER), numDims);
      DistributedLanczosSolver solver = new DistributedLanczosSolver();
      LanczosState state = new LanczosState(L, overshoot, DistributedLanczosSolver.getInitialVector(L));
      Path lanczosSeqFiles = new Path(outputCalc, "eigenvectors");

      solver.runJob(conf, state, overshoot, true, lanczosSeqFiles.toString());

      // perform a verification
      EigenVerificationJob verifier = new EigenVerificationJob();
      Path verifiedEigensPath = new Path(outputCalc, "eigenverifier");
      verifier.runJob(conf, lanczosSeqFiles, L.getRowPath(), verifiedEigensPath, true, 1.0, clusters);
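The later snippets use the LanczosState-based overload, solver.runJob(conf, state, desiredRank, isSymmetric, outputPath), where the state object carries the corpus, the requested rank, and the starting vector. The sketch below shows that call end to end; the LanczosState package location and constructor, the DistributedRowMatrix input, and the concrete rank and paths are illustrative assumptions (roughly Mahout 0.6-era), not taken from the excerpts.

    // Hedged sketch of the LanczosState-based runJob overload shown above.
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.mahout.math.decomposer.lanczos.LanczosState;
    import org.apache.mahout.math.hadoop.DistributedRowMatrix;
    import org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver;

    public class LanczosRunJobSketch {

      // Decompose a symmetric DistributedRowMatrix and write the raw
      // (unverified) eigenvectors as sequence files under outputCalc.
      public static Path decompose(Configuration conf,
                                   DistributedRowMatrix corpus,
                                   int desiredRank,
                                   Path outputCalc) throws IOException {
        DistributedLanczosSolver solver = new DistributedLanczosSolver();
        // Seed the iteration with the solver's default starting vector for this corpus.
        LanczosState state = new LanczosState(corpus, desiredRank, solver.getInitialVector(corpus));
        Path lanczosSeqFiles = new Path(outputCalc, "eigenvectors");
        // isSymmetric = true: the input matrix is symmetric (e.g. a graph Laplacian).
        solver.runJob(conf, state, desiredRank, true, lanczosSeqFiles.toString());
        return lanczosSeqFiles;
      }
    }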

Examples of org.apache.mahout.math.hadoop.decomposer.EigenVerificationJob.runJob()

                  lanczosSeqFiles.toString());

    // perform a verification
    EigenVerificationJob verifier = new EigenVerificationJob();
    Path verifiedEigensPath = new Path(outputCalc, "eigenverifier");
    verifier.runJob(conf, lanczosSeqFiles, L.getRowPath(), verifiedEigensPath, true, 1.0, 0.0, clusters);
    Path cleanedEigens = verifier.getCleanedEigensPath();
    DistributedRowMatrix W = new DistributedRowMatrix(cleanedEigens, new Path(cleanedEigens, "tmp"), clusters, numDims);
    W.configure(depConf);
    DistributedRowMatrix Wtrans = W.transpose();
    //    DistributedRowMatrix Wt = W.transpose();
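EigenVerificationJob.runJob takes the raw Lanczos output together with the original corpus rows, discards eigenvectors whose error is too large, and leaves the survivors under getCleanedEigensPath(). The sketch below labels the positional arguments of the eight-argument form used in the excerpt; the parameter names and meanings are inferred from that call site and should be treated as assumptions, as should the surrounding wrapper.

    // Hedged sketch of the EigenVerificationJob call shown above, with the
    // positional arguments annotated. Check the exact signature of the Mahout
    // version in use; some releases omit the minimum-eigenvalue argument.
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.mahout.math.hadoop.decomposer.EigenVerificationJob;

    public class EigenVerificationSketch {

      public static Path verify(Configuration conf,
                                Path lanczosSeqFiles,  // raw eigenvectors from DistributedLanczosSolver
                                Path corpusRowPath,    // rows of the original matrix, e.g. L.getRowPath()
                                Path outputCalc,
                                int clusters) throws IOException {
        EigenVerificationJob verifier = new EigenVerificationJob();
        Path verifiedEigensPath = new Path(outputCalc, "eigenverifier");
        verifier.runJob(conf,
                        lanczosSeqFiles,     // eigen input
                        corpusRowPath,       // corpus input
                        verifiedEigensPath,  // output
                        true,                // verify in memory
                        1.0,                 // maximum acceptable error
                        0.0,                 // minimum eigenvalue to keep
                        clusters);           // number of eigenvectors to retain
        // Only the eigenvectors that pass verification are written here.
        return verifier.getCleanedEigensPath();
      }
    }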