Ideas for managing data in a scheduled task

  • Bruce - PhosphorMedia
    replied
    Maybe I'm missing something, but couldn't the process just be:

    Run Initial Loop Up To Size Limit
    Write File
    Send File
    Delete File
    Repeat.

    And yes, I'm on drugs from minor surgery...but at least you might get a laugh from this idea.
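
    A rough Python sketch of that control flow, for whatever it's worth (the size limit, endpoint, and all names below are placeholders, not anything from the actual module):

    import json
    import os
    import urllib.request

    SIZE_LIMIT = 500_000                       # assumed per-request byte limit
    ENDPOINT = "https://example.com/receive"   # placeholder service URL

    def send_all(records):
        # Run Initial Loop Up To Size Limit, then Write / Send / Delete, Repeat.
        chunk = []
        for rec in records:
            candidate = chunk + [rec]
            if chunk and len(json.dumps(candidate).encode("utf-8")) > SIZE_LIMIT:
                flush(chunk)
                chunk = [rec]
            else:
                chunk = candidate
        if chunk:
            flush(chunk)                       # final partial chunk

    def flush(chunk):
        path = "outgoing_chunk.json"
        with open(path, "w") as f:             # Write File
            json.dump(chunk, f)
        with open(path, "rb") as f:            # Send File as the request body
            body = f.read()
        req = urllib.request.Request(ENDPOINT, data=body,
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)
        os.remove(path)                        # Delete File

    (As the rest of the thread points out, each send actually has to wait for the previous iteration's completion callback, which this simple loop ignores.)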



  • ids
    replied
    The current plan is for the task to run once per week. So running one scheduled task means tracking at least two timelines: the weekly timeline and a timeline that handles the functions processing the multiple JSON files queued up.

    The question isn't about waiting until the callback is received. My question is how to manage the labeling of the timelines. It seems like the way to do it is to use the filesystem, creating and maintaining files as status flags: a flag to know that a new status exists and trigger the next request, and also some flag to indicate which file or process is the next request?

    It seems to be shaking out so that the scheduled task runs frequently -- say every 6 hours instead of just once per week. It will track the weekly iteration. If that's complete, nothing else runs until the next "week." If the week isn't finished, then it iterates through the multiple files. Basically, this appears to want to run like a Foreach or While loop, with at least one inner loop, but using filenames as the variables. What file naming convention or file content might be effective?
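
    One possible shape for that, carrying the state entirely in filenames (the directory, batch ids, and status suffixes below are made-up examples, not an established convention):

    import glob
    import os

    QUEUE_DIR = "json_queue"   # hypothetical folder holding the week's chunk files

    # Example naming: <weekly batch>.<part>.<status>.json
    #   batch_2024-W05.part03.pending.json   queued, not yet sent
    #   batch_2024-W05.part03.sent.json      sent, waiting on the completion callback
    #   batch_2024-W05.part03.done.json      callback received, finished
    # The every-6-hours task only has to scan the directory to know where it is.

    def next_step():
        if glob.glob(os.path.join(QUEUE_DIR, "*.sent.json")):
            return ("wait", None)           # a request is still outstanding
        pending = sorted(glob.glob(os.path.join(QUEUE_DIR, "*.pending.json")))
        if pending:
            return ("send", pending[0])     # oldest pending part is next
        return ("idle", None)               # the week's batch is finished

    def mark(path, status):
        # e.g. mark(path, "sent") after sending; the callback handler marks "done"
        stem = path.rsplit(".", 2)[0]       # strip ".pending.json" / ".sent.json"
        os.rename(path, f"{stem}.{status}.json")

    The weekly timeline falls out of the same scheme: a new batch of .pending files is only written once every file from the previous batch is .done.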

    Still inviting suggestions.

    Scott



  • Kent Multer
    replied
    Well, aside from a minor grammatical quibble about a "required optional method" ... :^)

    I think this can be done with a single scheduled task. Before the task sends a new request, it just has to check whether a callback has been received for the last request.
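
    For example, if the callback handler drops a flag file when the completion message arrives (both filenames here are hypothetical), the guard at the top of the task can be as small as:

    import os

    SENT_FLAG = "last_request.sent"   # written when a request goes out
    DONE_FLAG = "last_request.done"   # written by the callback handler

    def ok_to_send_next():
        # Nothing outstanding, or the last request's completion callback arrived.
        return (not os.path.exists(SENT_FLAG)) or os.path.exists(DONE_FLAG)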



  • ids
    started a topic Ideas for managing data in a scheduled task

    What kind of procedures would you use?

    I have a custom module in development where I am sending a lot of data as a JSON string via REST/HTTP. There are two obstacles. First, the service being called has a fixed limit on how much data can be sent per request. Second, running it manually (a required optional method) from an admin screen can't reach that limit without a timeout anyway. Because of the amount of data, it may take 1, 2, 3, ... n iterations to send all the data.

    Additionally, iterations past the first cannot be sent until a callback with a "completion" message is received.

    I've discussed the limitations with the service. Without being able to monitor the response effectively, waiting to send the next iteration gives the service enough time to process the current iteration and issue the response that it completed. The current suggestion is one iteration every 24 hours. Generally, that seems to suggest breaking the data down into separate files of the JSON string, and the module is already doing that anyway.
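
    What the module already does amounts to something like this minimal sketch (the record count and filenames are placeholders; a byte-size check against the service's limit could replace the fixed count):

    import json

    RECORDS_PER_FILE = 500   # assumed count that keeps each file under the limit

    def write_part_files(records, prefix="export.part"):
        # Write the numbered JSON files that the scheduled runs will then send
        # one at a time, one iteration every 24 hours.
        for index, start in enumerate(range(0, len(records), RECORDS_PER_FILE), 1):
            with open(f"{prefix}{index:02d}.json", "w") as f:
                json.dump(records[start:start + RECORDS_PER_FILE], f)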

    I'm looking for suggestions to actually manage this unattended in a scheduled task, or multiple scheduled tasks.

    Thanks,

    Scott