Examples of initialize()

  • org.xtreemfs.babudb.sandbox.RandomGenerator.initialize()
    Generates the static meta-operations scenario. @param seed - must be the same on every BabuDB instance. @return the operations scenario.
  • pdp.scrabble.game.Player.initialize()
  • persistence.antlr.collections.AST.initialize()
  • ptolemy.actor.Director.initialize()
    Initialize the model controlled by this director. Set the current time to the start time or the current time of the executive director, and then invoke the initialize() method of this director on each actor that is controlled by this director. If the container is not an instance of CompositeActor, do nothing. This method should typically be invoked once per execution, after the preinitialization phase, but before any iteration. It may be invoked in the middle of an execution, if reinitialization is desired. Since type resolution has been completed and the current time is set, the initialize() method of a contained actor may produce output or schedule events. If stop() is called during this method's execution, then stop initializing actors immediately. This method is not synchronized on the workspace, so the caller should be. @exception IllegalActionException If the initialize() method of one of the associated actors throws it.
  • ptolemy.actor.Manager.initialize()
    Initialize the model. This calls the preinitialize() method of the container, followed by the resolveTypes() and initialize() methods. Set the Manager's state to PREINITIALIZING and INITIALIZING as appropriate. This method is read-synchronized on the workspace. @exception KernelException If the model throws it. @exception IllegalActionException If the model is already running, or if there is no container.
  • ptolemy.copernicus.kernel.GeneratorAttribute.initialize()
    If this GeneratorAttribute has not yet been initialized, initialize it by reading the MoML file named by initialParametersURL and creating Parameters and Variables accordingly.
  • ptolemy.distributed.common.DistributedActor.initialize()
    Begin execution of the actor. @exception RemoteException If a communication-related exception may occur during the execution of a remote method call.
  • ptolemy.graph.InequalityTerm.initialize()
    Initialize the value of this term to the specified CPO element. If this InequalityTerm is a simple variable that can be set to any CPO element, set the value of the variable to the specified argument. In this case, this method is equivalent to setValue() with the same argument. In some applications, this term is a structured object of which only part is a simple variable. In this case, set that variable part to the specified argument. @param e An Object representing an element in the underlying CPO. @exception IllegalActionException If this term is not a variable.
  • q_impress.pmi.lib.services.loadsave.LoadingService.initialize()
  • q_impress.pmi.lib.services.loadsave.SavingService.initialize()
  • rs.frenjoynet.core.core.BaseFacade.initialize()
  • shapes.OtherClass.initialize()
    initialize

  • share.LogWriter.initialize()
    initialization
  • sicel.compiler.parser.Lexer.initialize()
  • sos.stresstest.dialogtest.SOSDialogTest.initialize()
  • structures.Network.initialize()
    Initializes individual columns within the entire Network.
  • sun.security.pkcs11.Secmod.initialize()
    Initialize this Secmod. @param configDir the directory containing the NSS configuration files such as secmod.db @param nssLibDir the directory containing the NSS libraries (libnss3.so or nss3.dll), or null if the library is on the system default shared library path @throws IOException if NSS has already been initialized, the specified directories are invalid, or initialization fails for any other reason. A hedged usage sketch follows this list.
  • testrf.shared.TestRequestFactory.initialize()
  • uptimemart.Database.initialize()
    Creates the tables if they do not already exist.

    This method determines whether the SERVERS database has been created; if not, it creates it.

  • us.b3k.kafka.ws.transforms.Transform.initialize()
  • vsv.VSVModel.initialize()
  • waffle.windows.auth.IWindowsCredentialsHandle.initialize()
    Initialize.
  • waffle.windows.auth.impl.WindowsSecurityContextImpl.initialize()
  • weka.experiment.InstanceQuery.initialize()
  • wpn.hdri.ss.engine.EngineInitializer.initialize()
  • xbird.xquery.func.constructor.CastingFunction.initialize()
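
    For sun.security.pkcs11.Secmod.initialize() above, a minimal usage sketch. This is a hedged illustration only: Secmod is an unsupported sun.* API, and the NSS paths are placeholders, not values taken from any of the listed projects.

        import sun.security.pkcs11.Secmod;

        public class SecmodInitSketch {
            public static void main(String[] args) throws Exception {
                Secmod secmod = Secmod.getInstance();
                if (!secmod.isInitialized()) {
                    // Placeholder paths: the NSS database directory (the one holding
                    // secmod.db) and the directory holding libnss3.so / nss3.dll.
                    // Pass null for the library directory if NSS is already on the
                    // system default shared library path.
                    secmod.initialize("/etc/pki/nssdb", "/usr/lib64");
                }
            }
        }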

    Examples of org.apache.hadoop.hive.ql.udf.generic.GenericUDFDateSub.initialize()

        ObjectInspector valueOI1 = PrimitiveObjectInspectorFactory.writableDateObjectInspector;
        ObjectInspector valueOI2 = PrimitiveObjectInspectorFactory.javaIntObjectInspector;
        ObjectInspector[] arguments = {valueOI1, valueOI2};


        udf.initialize(arguments);
        DeferredObject valueObj1 = new DeferredJavaObject(new DateWritable(new Date(109, 06, 20)));
        DeferredObject valueObj2 = new DeferredJavaObject(new Integer("4"));
        DeferredObject[] args = {valueObj1, valueObj2};
        Text output = (Text) udf.evaluate(args);
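
    The excerpt above is clipped before the UDF is constructed and the result is checked. A self-contained sketch of the same call sequence follows; the class wrapper, imports, and the expected-output comment are assumptions, and the Text return type is taken from the cast in the excerpt.

        import java.sql.Date;

        import org.apache.hadoop.hive.ql.udf.generic.GenericUDF.DeferredJavaObject;
        import org.apache.hadoop.hive.ql.udf.generic.GenericUDF.DeferredObject;
        import org.apache.hadoop.hive.ql.udf.generic.GenericUDFDateSub;
        import org.apache.hadoop.hive.serde2.io.DateWritable;
        import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
        import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
        import org.apache.hadoop.io.Text;

        public class DateSubSketch {
            public static void main(String[] args) throws Exception {
                GenericUDFDateSub udf = new GenericUDFDateSub();

                // initialize() fixes the argument types: a writable date and a Java int.
                ObjectInspector dateOI = PrimitiveObjectInspectorFactory.writableDateObjectInspector;
                ObjectInspector daysOI = PrimitiveObjectInspectorFactory.javaIntObjectInspector;
                udf.initialize(new ObjectInspector[] { dateOI, daysOI });

                // evaluate() receives the actual values wrapped as DeferredObjects.
                DeferredObject dateArg = new DeferredJavaObject(new DateWritable(Date.valueOf("2009-07-20")));
                DeferredObject daysArg = new DeferredJavaObject(Integer.valueOf(4));
                Text output = (Text) udf.evaluate(new DeferredObject[] { dateArg, daysArg });
                System.out.println(output); // assumed result: 2009-07-16, four days earlier
            }
        }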

    Examples of org.apache.hadoop.hive.ql.udf.generic.GenericUDFHash.initialize()

                    objectInspectors[i] = getJavaObjectInspector(entry.getKey());
                    deferredObjects[i] = getJavaDeferredObject(entry.getValue(), entry.getKey());
                    i++;
                }

                ObjectInspector udfInspector = udf.initialize(objectInspectors);
                checkArgument(udfInspector instanceof IntObjectInspector, "expected IntObjectInspector: %s", udfInspector);
                IntObjectInspector inspector = (IntObjectInspector) udfInspector;

                Object result = udf.evaluate(deferredObjects);
                HiveKey hiveKey = new HiveKey();
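
    The excerpt relies on helper methods from its enclosing test class. A minimal, hypothetical sketch of the same two-phase contract without those helpers (the single string argument and the imports are assumptions):

        import org.apache.hadoop.hive.ql.udf.generic.GenericUDF.DeferredJavaObject;
        import org.apache.hadoop.hive.ql.udf.generic.GenericUDF.DeferredObject;
        import org.apache.hadoop.hive.ql.udf.generic.GenericUDFHash;
        import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
        import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

        public class HashUdfSketch {
            public static void main(String[] args) throws Exception {
                GenericUDFHash udf = new GenericUDFHash();

                // hash() accepts any number of arguments; here, a single Java string.
                ObjectInspector stringOI = PrimitiveObjectInspectorFactory.javaStringObjectInspector;
                ObjectInspector returnOI = udf.initialize(new ObjectInspector[] { stringOI });
                System.out.println("return type: " + returnOI.getTypeName()); // the excerpt expects an IntObjectInspector

                DeferredObject arg = new DeferredJavaObject("hello");
                Object hash = udf.evaluate(new DeferredObject[] { arg });
                System.out.println("hash: " + hash);
            }
        }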

    Examples of org.apache.hadoop.hive.ql.udf.generic.GenericUDFLTrim.initialize()

      public void testTrim() throws HiveException {
        GenericUDFLTrim udf = new GenericUDFLTrim();
        ObjectInspector valueOI = PrimitiveObjectInspectorFactory.writableStringObjectInspector;
        ObjectInspector[] arguments = { valueOI };

        udf.initialize(arguments);
        runAndVerify(" Hello World! ", "Hello World! ", udf);
        runAndVerify("Hello World! ", "Hello World! ", udf);
        runAndVerify(" Hello World!", "Hello World!", udf);
        runAndVerify("Hello World!", "Hello World!", udf);
        runAndVerify("   ", "", udf);

    Examples of org.apache.hadoop.hive.ql.udf.generic.GenericUDFLpad.initialize()

        ObjectInspector valueOI1 = PrimitiveObjectInspectorFactory.writableStringObjectInspector;
        ObjectInspector valueOI2 = PrimitiveObjectInspectorFactory.writableIntObjectInspector;
        ObjectInspector valueOI3 = PrimitiveObjectInspectorFactory.writableStringObjectInspector;
        ObjectInspector[] arguments = { valueOI1, valueOI2, valueOI3 };

        udf.initialize(arguments);
        runAndVerify("hi", 5, "??", "???hi", udf);
        runAndVerify("hi", 1, "??", "h", udf);
      }

      private void runAndVerify(String str, int len, String pad, String expResult, GenericUDF udf)

    Examples of org.apache.hadoop.hive.ql.udf.generic.GenericUDFRTrim.initialize()

      public void testTrim() throws HiveException {
        GenericUDFRTrim udf = new GenericUDFRTrim();
        ObjectInspector valueOI = PrimitiveObjectInspectorFactory.writableStringObjectInspector;
        ObjectInspector[] arguments = { valueOI };

        udf.initialize(arguments);
        runAndVerify(" Hello World! ", " Hello World!", udf);
        runAndVerify("Hello World! ", "Hello World!", udf);
        runAndVerify(" Hello World!", " Hello World!", udf);
        runAndVerify("Hello World!", "Hello World!", udf);
        runAndVerify("   ", "", udf);

    Examples of org.apache.hadoop.hive.ql.udf.generic.GenericUDFTrim.initialize()

      public void testTrim() throws HiveException {
        GenericUDFTrim udf = new GenericUDFTrim();
        ObjectInspector valueOI = PrimitiveObjectInspectorFactory.writableStringObjectInspector;
        ObjectInspector[] arguments = { valueOI };

        udf.initialize(arguments);
        runAndVerify(" Hello World! ", "Hello World!", udf);
        runAndVerify("Hello World! ", "Hello World!", udf);
        runAndVerify(" Hello World!", "Hello World!", udf);
        runAndVerify("Hello World!", "Hello World!", udf);
        runAndVerify("   ", "", udf);

    Examples of org.apache.hadoop.hive.ql.udf.ptf.TableFunctionResolver.initialize()

            }
            def.addArg(argDef);
          }
        }

        tFn.initialize(hCfg, ptfDesc, def);
        TableFunctionEvaluator tEval = tFn.getEvaluator();
        def.setTFunction(tEval);
        def.setCarryForwardNames(tFn.carryForwardNames());
        tFn.setupRawInputOI();

    Examples of org.apache.hadoop.hive.ql.udf.ptf.WindowingTableFunction.WindowingTableFunctionResolver.initialize()

        wdwTFnDef.setName(FunctionRegistry.WINDOWING_TABLE_FUNCTION);
        wdwTFnDef.setResolverClassName(tFn.getClass().getName());
        wdwTFnDef.setAlias("ptf_" + 1);
        wdwTFnDef.setExpressionTreeString(null);
        wdwTFnDef.setTransformsRawInput(false);
        tFn.initialize(hCfg, ptfDesc, wdwTFnDef);
        TableFunctionEvaluator tEval = tFn.getEvaluator();
        wdwTFnDef.setTFunction(tEval);
        wdwTFnDef.setCarryForwardNames(tFn.carryForwardNames());
        wdwTFnDef.setRawInputShape(inpShape);

    Examples of org.apache.hadoop.hive.serde2.AbstractSerDe.initialize()

        Path file = null;
        // the SerDe part is from TestLazySimpleSerDe
        AbstractSerDe serDe = new ColumnarSerDe();
        // Create the SerDe
        Properties tbl = createProperties();
        serDe.initialize(conf, tbl);

        String usage = "Usage: RCFile " + "[-count N]" + " file";
        if (args.length == 0) {
          System.err.println(usage);
          System.exit(-1);
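
    createProperties() is not shown in the excerpt. A minimal sketch of initializing a ColumnarSerDe with the table properties spelled out; the column names and types here are assumptions, not the original test's values:

        import java.util.Properties;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.hive.serde.serdeConstants;
        import org.apache.hadoop.hive.serde2.AbstractSerDe;
        import org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe;

        public class ColumnarSerDeInitSketch {
            public static void main(String[] args) throws Exception {
                // Table metadata the SerDe needs before it can (de)serialize rows.
                Properties tbl = new Properties();
                tbl.setProperty(serdeConstants.LIST_COLUMNS, "key,value");
                tbl.setProperty(serdeConstants.LIST_COLUMN_TYPES, "int,string");

                AbstractSerDe serDe = new ColumnarSerDe();
                serDe.initialize(new Configuration(), tbl);
                System.out.println(serDe.getObjectInspector().getTypeName());
            }
        }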

    Examples of org.apache.hadoop.hive.serde2.Deserializer.initialize()

        String tableName = String.valueOf(tblProps.getProperty("name"));
        String partName = String.valueOf(partSpec);
        // HiveConf.setVar(hconf, HiveConf.ConfVars.HIVETABLENAME, tableName);
        // HiveConf.setVar(hconf, HiveConf.ConfVars.HIVEPARTITIONNAME, partName);
        Deserializer deserializer = (Deserializer) sdclass.newInstance();
        deserializer.initialize(hconf, tblProps);
        StructObjectInspector rawRowObjectInspector = (StructObjectInspector) deserializer
            .getObjectInspector();

        MapOpCtx opCtx = null;
        // Next check if this table has partitions and if so
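
    Here the concrete Deserializer comes from reflection over sdclass. A hedged, self-contained sketch of the same initialize-then-inspect pattern, using LazySimpleSerDe as a stand-in (the class choice, column names, and types are assumptions):

        import java.util.Properties;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.hive.serde.serdeConstants;
        import org.apache.hadoop.hive.serde2.Deserializer;
        import org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe;
        import org.apache.hadoop.hive.serde2.objectinspector.StructField;
        import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;

        public class DeserializerInitSketch {
            public static void main(String[] args) throws Exception {
                Properties tbl = new Properties();
                tbl.setProperty(serdeConstants.LIST_COLUMNS, "id,name");
                tbl.setProperty(serdeConstants.LIST_COLUMN_TYPES, "int,string");

                Deserializer deserializer = new LazySimpleSerDe();
                deserializer.initialize(new Configuration(), tbl);

                // The row ObjectInspector is only meaningful after initialize().
                StructObjectInspector rowOI =
                    (StructObjectInspector) deserializer.getObjectInspector();
                for (StructField field : rowOI.getAllStructFieldRefs()) {
                    System.out.println(field.getFieldName() + " : "
                        + field.getFieldObjectInspector().getTypeName());
                }
            }
        }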