Hello, I'm a beginner in Elixir and I want to parse a CSV file and store it in an Elixir data structure.
But it displays this:
** (FunctionClauseError) no function clause matching in anonymous fn/1 in Siren.parseCSV/0
The following arguments were given to anonymous fn/1 in Siren.parseCSV/0:
# 1
["41", "5", "59", "N", "80", "39", "0", "W", "Youngstown", "OH"]
anonymous fn/1 in Siren.parseCSV/0
(elixir 1.10.3) lib/stream.ex:482: anonymous fn/4 in Stream.filter/2
(elixir 1.10.3) lib/stream.ex:1449: Stream.do_element_resource/6
(elixir 1.10.3) lib/stream.ex:1609: Enumerable.Stream.do_each/4
(elixir 1.10.3) lib/enum.ex:959: Enum.find/3
(mix 1.10.3) lib/mix/task.ex:330: Mix.Task.run_task/3
(mix 1.10.3) lib/mix/cli.ex:82: Mix.CLI.run_task/2
Here is my code:
defmodule Siren do
  def parseCSV do
    IO.puts("Let's parse CSV file...")
    File.stream!("../name.csv")
    |> Stream.map(&String.trim(&1))
    |> Stream.map(&String.split(&1, ","))
    |> Stream.filter(fn
      ["LatD" | _] -> false
    end)
    |> Enum.find(fn State -> String
      [LatD, LatM, LatS, NS, LonD, LonM, LonS, EW, City, State] ->
        IO.puts("find -> #{State}")
        true
    end)
  end
end
And the CSV file:
LatD,LatM,LatS,NS,LonD,LonM,LonS,EW,City,State
41,5,59,N,80,39,0,W,Youngstown,OH
42,52,48,N,97,23,23,W,Yankton,SD
46,35,59,N,120,30,36,W,Yakima,WA
42,16,12,N,71,48,0,W,Worcester,MA
43,37,48,N,89,46,11,W,WisconsinDells,WI
36,5,59,N,80,15,0,W,Winston-Salem,NC
49,52,48,N,97,9,0,W,Winnipeg,MB
39,11,23,N,78,9,36,W,Winchester,VA
34,14,24,N,77,55,11,W,Wilmington,NC
39,45,0,N,75,33,0,W,Wilmington,DE
48,9,0,N,103,37,12,W,Williston,ND
41,15,0,N,77,0,0,W,Williamsport,PA
37,40,48,N,82,16,47,W,Williamson,WV
33,54,0,N,98,29,23,W,WichitaFalls,TX
37,41,23,N,97,20,23,W,Wichita,KS
40,4,11,N,80,43,12,W,Wheeling,WV
26,43,11,N,80,3,0,W,WestPalmBeach,FL
47,25,11,N,120,19,11,W,Wenatchee,WA
41,25,11,N,122,23,23,W,Weed,CA
The first issue is here:
|> Stream.filter(fn
  ["LatD" | _] -> false
end)
Every line goes through this filter, but the single clause you provided matches only the first (header) row; for any other row there is no matching clause, hence the FunctionClauseError. This would fix the issue:
|> Stream.filter(fn
  ["LatD" | _] -> false
  _ -> true
end)
or
|> Stream.reject(&match?(["LatD" | _], &1))
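For reference, a quick check in iex (using trimmed-down rows from your file) shows what match?/2 returns for the header pattern, which is exactly what Stream.reject uses to drop the header row:
iex> match?(["LatD" | _], ["LatD", "LatM", "LatS"])
true
iex> match?(["LatD" | _], ["41", "5", "59", "N"])
false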
The Enum.find(fn State -> String that follows looks unclear and will surely be the next issue; I could not understand what you were trying to achieve there.
The general advice would be: don't reinvent the wheel and use NimbleCSV, written by José Valim, to parse CSVs, because there are a lot of corner cases (like commas inside quoted fields) that are handled properly in the aforementioned library.
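For illustration only, here is a minimal sketch of the same task with NimbleCSV, assuming the nimble_csv dependency has been added to mix.exs; the state == "NC" predicate is just an example:
defmodule Siren do
  alias NimbleCSV.RFC4180, as: CSV

  def parseCSV do
    "../name.csv"
    |> File.stream!()
    # parse_stream/1 skips the header row by default
    |> CSV.parse_stream()
    |> Enum.find(fn [_latd, _latm, _lats, _ns, _lond, _lonm, _lons, _ew, _city, state] ->
      state == "NC"
    end)
  end
end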
Aleksei Matiushkin gave you the right answer, but you also have this function:
fn
  State ->
    String
  [LatD, LatM, LatS, NS, LonD, LonM, LonS, EW, City, State] ->
    IO.puts("find -> #{State}")
    true
end
It accepts two possible values: either State, which is an atom, or a list of 10 specific atoms.
What you want to do is use variables, and variables in Elixir start with a lowercase letter (or with an underscore if the value is to be ignored).
fn
  state ->
    String
  [latd, latm, lats, ns, lond, lonm, lons, ew, city, state] ->
    IO.puts("find -> #{state}")
    true
end
But in this case, the first clause of the function will always match anything because it acts like a catch-all clause.
What you probably want is:
fn
  [_latd, _latm, _lats, _ns, _lond, _lonm, _lons, _ew, _city, state] ->
    IO.puts("find -> #{state}")
    # here decide if you want to return true or false,
    # for instance `state == "NC"`
    true
end
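Putting both fixes together, a minimal sketch of the corrected module (keeping your file path; state == "NC" is again only an example predicate) could look like this:
defmodule Siren do
  def parseCSV do
    IO.puts("Let's parse CSV file...")

    File.stream!("../name.csv")
    |> Stream.map(&String.trim/1)
    |> Stream.map(&String.split(&1, ","))
    # drop the header row
    |> Stream.reject(&match?(["LatD" | _], &1))
    # return the first data row whose State column matches
    |> Enum.find(fn [_latd, _latm, _lats, _ns, _lond, _lonm, _lons, _ew, _city, state] ->
      IO.puts("find -> #{state}")
      state == "NC"
    end)
  end
end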
My JSON looks similar to this:
{ "items" :
[ { "type" : 0, "order": 10, "content": { "a" : 10, "b" : "description", ... } }
, { "type" : 1, "order": 11, "content": { "a" : 11, "b" : "same key, but different use", ... } }
, { "type" : 2, "order": 12, "content": { "c": "totally different fields", ... } }
...
]
}
and I want to use the type value to decide which union type to create while decoding. So I defined type aliases and decoders for all of the above in Elm:
import Json.Decode exposing (..)
import Json.Decode.Pipeline exposing (..)
type alias Type0Content = { a : Int, b : String }
type alias Type1Content = { a : Int, b2 : String }
type alias Type2Content = { c : String }
type Content = Type0 Type0Content | Type1 Type1Content | Type2 Type2Content
type alias Item = { order : Int, type : Int, content: Content }
decode0 = succeed Type0Content
  |> requiredAt ["content", "a"] int
  |> requiredAt ["content", "b"] string
decode1 = succeed Type1Content
  |> requiredAt ["content", "a"] int
  |> requiredAt ["content", "b"] string
decode2 = succeed Type2Content
  |> requiredAt ["content", "c"] string
decodeContentByType hint =
  case hint of
    0 -> Type0 decode0
    1 -> Type1 decode1
    2 -> Type2 decode2
    _ -> fail "unknown type"
decodeItem = succeed Item
  |> required "order" int
  |> required "type" int `andThen` decodeContentByType
Can't get the last two functions to interact as needed.
I've read through page 33 of The JSON Survival Kit by Brian Hicks, but that didn't put me on track either.
Any advice and reading material appreciated!
It looks like the book was written targeting Elm 0.17 or below. In Elm 0.18, the backtick syntax was removed. You will also need to use a different field name for type since it is a reserved word, so I'll rename it type_.
Some annotations might help narrow down bugs. Let's annotate decodeContentByType, because right now the branches aren't returning the same type. The three successful branches should map the decoder onto the expected Content constructor:
decodeContentByType : Int -> Decoder Content
decodeContentByType hint =
  case hint of
    0 -> map Type0 decode0
    1 -> map Type1 decode1
    2 -> map Type2 decode2
    _ -> fail "unknown type"
Now, to address the decodeItem function. We need three fields to satisfy the Item constructor. The second field is the type, which can be obtained via required "type" int, but the third field relies on the "type" value to deduce the correct constructor. We can use andThen (with pipeline syntax as of Elm 0.18) after fetching the Decoder Int value using Elm's field decoder:
decodeItem : Decoder Item
decodeItem = succeed Item
  |> required "order" int
  |> required "type" int
  |> custom (field "type" int |> andThen decodeContentByType)
I'm trying to find URLs in a nested JSON response and map them. My function so far looks like this:
def list(env, id) do
  Service.get_document(env, id)
  |> Poison.decode!
  |> Enum.find(fn {_key, val} -> String.starts_with?(val, 'https') end)
end
The JSON looks roughly like this:
"stacks": [
{
"boxes": [
{
"content": "https://ddd.cloudfront.net/photos/uploaded_images/000/001/610/original/1449447147677.jpg?1505956120",
"box": "photo"
}
]
}
],
"logo": "https://ddd.cloudfront.net/users/cmyk_banners/000/000/002/original/banner_CMYK.jpg?1397201875"
So URLs can have any key, and be at any level.
With that code I get this error:
no function clause matching in String.starts_with?/2
Anyone got a better way to find in JSON responses?
You'll have to use a recursive function for this, one which handles three kinds of data:
For a map, it recurses over all of its values.
For a list, it recurses over all of its elements.
For a string, it keeps the string if it starts with "https".
Any other value (number, boolean, nil) is simply ignored.
Here's a simple implementation which accepts a term and a string to check with starts_with?:
defmodule A do
  def recursive_starts_with(thing, start, acc \\ [])

  def recursive_starts_with(binary, start, acc) when is_binary(binary) do
    if String.starts_with?(binary, start) do
      [binary | acc]
    else
      acc
    end
  end

  def recursive_starts_with(map, start, acc) when is_map(map) do
    Enum.reduce(map, acc, fn {_, v}, acc -> A.recursive_starts_with(v, start, acc) end)
  end

  def recursive_starts_with(list, start, acc) when is_list(list) do
    Enum.reduce(list, acc, fn v, acc -> A.recursive_starts_with(v, start, acc) end)
  end

  # any other value (number, boolean, nil) contributes nothing
  def recursive_starts_with(_other, _start, acc), do: acc
end
data = %{
  "stacks" => [
    %{
      "boxes" => [
        %{
          "content" => "https://ddd.cloudfront.net/photos/uploaded_images/000/001/610/original/1449447147677.jpg?1505956120",
          "box" => "photo"
        }
      ]
    }
  ],
  "logo" => "https://ddd.cloudfront.net/users/cmyk_banners/000/000/002/original/banner_CMYK.jpg?1397201875"
}
data |> A.recursive_starts_with("https") |> IO.inspect
Output:
["https://ddd.cloudfront.net/photos/uploaded_images/000/001/610/original/1449447147677.jpg?1505956120",
"https://ddd.cloudfront.net/users/cmyk_banners/000/000/002/original/banner_CMYK.jpg?1397201875"]
I have a JSON file that I am trying to parse using Scala. I have figured out how to use the Scala JSON parsing library to parse one entry in this format:
{"name":"John","number":"005","fav_colour":"blue"}
this is the code that works:
val result = JSON.parseFull("""{"name":"John","number":"005","fav_colour":"blue"}""")
result match {
  case Some(e) => println(e)
  case None => println("Failed.")
}
This prints Map(name -> John, number -> 005, fav_colour -> blue)
The code is based off of this: https://gist.github.com/takezoe/1540223
However, I am working with a file like this:
""" {"name":"John","number":"005","fav_colour":"blue"}
{"name":"Mary","number":"010","fav_colour":"yellow"}
{"name":"Anna","number":"007","fav_colour":"pink"}
{"name":"Dave","number":"003","fav_colour":"purple"}
"""
Note: I also tried separating the entries with commas and it still did not work.
I am just wondering if I have to write a function to separate each {bracketed entry} or if there is some functionality of the JSON library that I am missing. So far, when I pass in my file it returns None instead of Some(valid information).
Thanks!
You don't have a valid JSON file. This would be valid:
[
{"name":"John","number":"005","fav_colour":"blue"},
{"name":"Mary","number":"010","fav_colour":"yellow"},
{"name":"Anna","number":"007","fav_colour":"pink"},
{"name":"Dave","number":"003","fav_colour":"purple"}
]
Result:
Some(List(Map(name -> John, number -> 005, fav_colour -> blue), Map(name -> Mary, number -> 010, fav_colour -> yellow), Map(name -> Anna, number -> 007, fav_colour -> pink), Map(name -> Dave, number -> 003, fav_colour -> purple)))
http://www.scalakata.com/522bdbfeebb25c7f5d823c7d
The format you use is convenient for gathering information over time, e.g. keeping logs.
You can parse it by reusing the parser combinators!
For example:
import scala.util.parsing.json.JSON
val parseResult = JSON.rep1(JSON.root)(new JSON.lexical.Scanner("{\"a\": 1} {\"b\": 2}"))
parseResult match {case JSON.Success (result, _) => result; case _ => Nil}
returns
List({"a" : 1.0}, {"b" : 2.0})