export dawn_dusk_of_sunrise_sunset, get_sunrise_sunset_utc, make_spectro_from_file, move_one_hour!, resample_to_16000hz, resample_to_8000hz, resize_image!, utc_to_nzdt!, check_png_wav_both_present
end

using CSV, DataFrames, Dates, HTTP, JSON3, TimeZones

##### Deprecated: now have a big sunset_sunrise_utc.csv to work from.

"""
twilight_tuple_local_time(dt::Date)

Takes a date and returns a tuple of local-time twilight times. Use it to build a DataFrame, then write a CSV.
Queries api.sunrise-sunset.org.
Was using civil_twilight_end / civil_twilight_begin; changed to sunrise / sunset.

Use like this:

using CSV, Dates, DataFrames, Skraak
df = DataFrames.DataFrame(Date = [], Dawn = [], Dusk = [])
dr = Dates.Date(2019, 01, 01):Dates.Day(1):Dates.Date(2024, 12, 31)
for day in dr
    q = Skraak.twilight_tuple_local_time(day)
    isempty(q) ? println("fail $day") : push!(df, q)
    sleep(5)
end
CSV.write("dawn_dusk.csv", df)
"""
function twilight_tuple_local_time(dt::Date)
    # C05 co-ordinates are hard coded into this function
    resp1 = HTTP.get(
        "https://api.sunrise-sunset.org/json?lat=-45.50608&lng=167.47822&date=$dt&formatted=0",
    )
    # resp2 = String(resp1.body) |> JSON.Parser.parse
    resp2 = String(resp1.body) |> x -> JSON3.read(x)
    resp3 = get(resp2, "results", "missing")
    dusk_utc = get(resp3, "sunset", "missing")
    dusk_utc_zoned = ZonedDateTime(dusk_utc, "yyyy-mm-ddTHH:MM:SSzzzz")
    dusk_local = astimezone(dusk_utc_zoned, tz"Pacific/Auckland")
    dusk_string = Dates.format(dusk_local, "yyyy-mm-ddTHH:MM:SS")
    dawn_utc = get(resp3, "sunrise", "missing")
    dawn_utc_zoned = ZonedDateTime(dawn_utc, "yyyy-mm-ddTHH:MM:SSzzzz")
    dawn_local = astimezone(dawn_utc_zoned, tz"Pacific/Auckland")
    dawn_string = Dates.format(dawn_local, "yyyy-mm-ddTHH:MM:SS")
    date = Dates.format(dt, "yyyy-mm-dd")
    return (date, dawn_string, dusk_string)
end
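As a sanity check on the UTC-to-local conversion above, the same parse-and-shift can be run on a fixed timestamp in isolation. A minimal sketch with TimeZones.jl; the timestamp here is illustrative, not real API output:

```julia
using Dates, TimeZones

# Parse an ISO-8601 UTC timestamp in the same format the API returns,
# then shift it to New Zealand local time, as twilight_tuple_local_time does.
utc = ZonedDateTime("2023-06-21T07:30:00+00:00", "yyyy-mm-ddTHH:MM:SSzzzz")
local_nz = astimezone(utc, tz"Pacific/Auckland")
# In June, New Zealand is on NZST (UTC+12), so 07:30 UTC becomes 19:30 local.
Dates.format(local_nz, "yyyy-mm-ddTHH:MM:SS")
```

Because `Pacific/Auckland` is a variable time zone, the same code also handles NZDT (UTC+13) dates correctly without any extra logic.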
=#

function make_spectro_from_file(file::String)
    signal, freq = WAV.wavread("$file")
    freq = freq |> Float32
    partitioned_signal = Iterators.partition(signal, 80000) # 5 s clips
    for (index, part) in enumerate(partitioned_signal)
        length(part) > 50000 && begin
            outfile = "$(chop(file, head = 0, tail = 4))__$(index)"
            image = Skraak.get_image_from_sample(part, freq)
            PNGFiles.save("$outfile.png", image)
        end
    end
end
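The 80000-sample partition length corresponds to 5 s of audio at the 16000 Hz sample rate this pipeline assumes, and the `length(part) > 50000` guard drops any trailing clip shorter than about 3 s. A minimal sketch of that arithmetic on a dummy signal (the signal values are placeholders):

```julia
# Dummy 15.625 s recording at 16 kHz: 250_000 samples of silence.
signal = zeros(Float32, 250_000)
parts = collect(Iterators.partition(signal, 80_000))
# 3 full 5 s clips (80_000 samples each) plus a 10_000-sample remainder;
# the remainder is below the 50_000-sample threshold, so it would be skipped.
kept = [p for p in parts if length(p) > 50_000]
(length(parts), length(kept))
```

Note that `Iterators.partition` never pads: the final chunk simply carries whatever samples are left over.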
export actual_from_folders, aggregate_labels, audiodata_db, avianz_file_of_dict, avianz_of_raven, check_change_avianz_species!, df_of_avianz_dict, dict_of_avianz_file, label_summary, prepare_df_for_raven, raven_of_avianz
This function takes the CSV output from my hand classification and outputs a DataFrame, and a CSV for insertion into AudioData.duckdb using the duckdb CLI or audiodata_db().
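For reference, one way to bulk-load such a CSV into AudioData.duckdb from Julia is DuckDB.jl with DuckDB's `read_csv_auto`. This is a hedged sketch: the table name `pomona_labels`, the CSV filename, and its columns are illustrative assumptions, not taken from this codebase:

```julia
using DuckDB, DBInterface

# Write a tiny stand-in CSV (in the real pipeline this would be the
# hand-classification output described above).
write("labels.csv", "file,label\nC05/2023-10-01/file1.wav,1\n")

# Open (or create) the database file and load the CSV in one statement.
con = DBInterface.connect(DuckDB.DB, "AudioData.duckdb")
DBInterface.execute(
    con,
    "CREATE TABLE IF NOT EXISTS pomona_labels AS SELECT * FROM read_csv_auto('labels.csv')",
)
row_count = first(DBInterface.execute(con, "SELECT count(*) AS n FROM pomona_labels")).n
DBInterface.close!(con)
```

The duckdb CLI equivalent is the same `CREATE TABLE ... AS SELECT * FROM read_csv_auto(...)` statement run against the database file.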
function move_files(input_file::String, output_path::String)
PNGFiles.save("$output_path$folder/K/$folder-$start-$(start+4).png", plot)
PNGFiles.save("$output_path$folder/N/$folder-$start-$(start+4).png", plot)
        end
    end
end

# Convert mp3s with: for file in *.mp3; do ffmpeg -i "${file}" -ar 16000 "${file%.*}.wav"; done
# Requires 16000 Hz wavs; works in the current folder; needs ffmpeg to convert mp3s to wavs at 16000 Hz
#=
wavs = Glob.glob("*.wav")
for wav in wavs
    Skraak.make_spectro_from_file(wav)
end
=#
using DSP, GLMakie, PNGFiles

function get_colour_image_from_sample(sample, f)
    dims = 224 # px
    S = DSP.spectrogram(sample[:, 1], 400, 2; fs = f)
    # note: `f` is rebound here from the sample rate to the Figure
    f = GLMakie.Figure(resolution = (dims, dims), figure_padding = 0)
    ax = GLMakie.Axis(f[1, 1], spinewidth = 0)
    GLMakie.hidedecorations!(ax)
    GLMakie.heatmap!(ax, (DSP.pow2db.(S.power))', colormap = :inferno)
a = Glob.glob("*/*/preds-2024-08-29.csv")
b = Glob.glob("*/*/*/preds-2024-08-29.csv")
c = Glob.glob("*/*/*/*/preds-2024-08-29.csv")
predictions = [a; b; c]
## or
predictions = Glob.glob("*/*/preds-2024-10-21.csv")
#=
For Kahurangi Data

a = Glob.glob("*/*/*.csv")
b = Glob.glob("*/*/*/*.csv")
c = Glob.glob("*/*/*/*/*.csv")
list = [a; b; c]

# to delete empty preds.csv files
for file in list
    size = stat(file).size
    if size < 10
        println("Deleting $file - $size")
        rm(file)
    end
end
=#

## change date on preds below (no longer required at 25/10/24)
function make_clips_kahurangi(preds_path::String, label::Int = 1)
function make_clips_generic(preds_path::String, label::Int = 1)
#=
using Glob, Skraak, CSV, DataFrames, Dates, PNGFiles

a = Glob.glob("*/*/preds-2024-08-29.csv")
b = Glob.glob("*/*/*/preds-2024-08-29.csv")
c = Glob.glob("*/*/*/*/preds-2024-08-29.csv")
predictions = [a; b; c]
## or
predictions = Glob.glob("*/*/preds-2024-10-21.csv")

for file in predictions
    try
        make_clips_kahurangi(file)
    catch x
        println(x)
    end
end
=#
deps = ["AxisAlgorithms", "ChainRulesCore", "LinearAlgebra", "OffsetArrays", "Random", "Ratios", "Requires", "SharedArrays", "SparseArrays", "StaticArrays", "WoodburyMatrices"]
git-tree-sha1 = "00a19d6ab0cbdea2978fc23c5a6482e02c192501"
deps = ["Adapt", "AxisAlgorithms", "ChainRulesCore", "LinearAlgebra", "OffsetArrays", "Random", "Ratios", "Requires", "SharedArrays", "SparseArrays", "StaticArrays", "WoodburyMatrices"]
git-tree-sha1 = "721ec2cf720536ad005cb38f50dbba7b02419a15"
deps = ["Artifacts", "Dates", "Downloads", "FileWatching", "LibGit2", "Libdl", "Logging", "Markdown", "Printf", "REPL", "Random", "SHA", "Serialization", "TOML", "Tar", "UUIDs", "p7zip_jll"]
deps = ["Artifacts", "Dates", "Downloads", "FileWatching", "LibGit2", "Libdl", "Logging", "Markdown", "Printf", "Random", "SHA", "TOML", "Tar", "UUIDs", "p7zip_jll"]